Updates from: 06/06/2022 01:08:57
Service Microsoft Docs article Related commit history on GitHub Change details
active-directory-b2c Azure Monitor https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/azure-monitor.md
Previously updated : 02/23/2022 Last updated : 06/03/2022 # Monitor Azure AD B2C with Azure Monitor
-Use Azure Monitor to route Azure Active Directory B2C (Azure AD B2C) sign-in and [auditing](view-audit-logs.md) logs to different monitoring solutions. You can retain the logs for long-term use or integrate with third-party security information and event management (SIEM) tools to gain insights into your environment.
+Use Azure Monitor to route Azure Active Directory B2C (Azure AD B2C) sign in and [auditing](view-audit-logs.md) logs to different monitoring solutions. You can retain the logs for long-term use or integrate with third-party security information and event management (SIEM) tools to gain insights into your environment.
You can route log events to:
![Azure Monitor](./media/azure-monitor/azure-monitor-flow.png)
+When you plan to transfer Azure AD B2C logs to a different monitoring solution or repository, consider that Azure AD B2C logs contain personal data. When you process such data, ensure you use appropriate security measures on the personal data. This includes protection against unauthorized or unlawful processing, using appropriate technical or organizational measures.
+ In this article, you learn how to transfer the logs to an Azure Log Analytics workspace. Then you can create a dashboard or create alerts that are based on Azure AD B2C users' activities.
-> [!IMPORTANT]
-> When you plan to transfer Azure AD B2C logs to different monitoring solutions, or repository, consider the following. Azure AD B2C logs contain personal data. Such data should be processed in a manner that ensures appropriate security of the personal data, including protection against unauthorized or unlawful processing, using appropriate technical or organizational measures.
Watch this video to learn how to configure monitoring for Azure AD B2C using Azure Monitor.
## Deployment overview
-Azure AD B2C leverages [Azure Active Directory monitoring](../active-directory/reports-monitoring/overview-monitoring.md). Because an Azure AD B2C tenant, unlike Azure AD tenants, can't have a subscription associated with it, we need to take some additional steps to enable the integration between Azure AD B2C and Log Analytics, which is where we'll send the logs.
+Azure AD B2C uses [Azure Active Directory monitoring](../active-directory/reports-monitoring/overview-monitoring.md). Unlike Azure AD tenants, an Azure AD B2C tenant can't have a subscription associated with it. So, we need to take extra steps to enable the integration between Azure AD B2C and Log Analytics, which is where we'll send the logs.
To enable _Diagnostic settings_ in Azure Active Directory within your Azure AD B2C tenant, you use [Azure Lighthouse](../lighthouse/overview.md) to [delegate a resource](../lighthouse/concepts/architecture.md), which allows your Azure AD B2C (the **Service Provider**) to manage an Azure AD (the **Customer**) resource. > [!TIP]
After you complete the steps in this article, you'll have created a new resource group (here called _azure-ad-b2c-monitor_) and have access to that same resource group that contains the [Log Analytics workspace](../azure-monitor/logs/quick-create-workspace.md) in your **Azure AD B2C** portal. You'll also be able to transfer the logs from Azure AD B2C to your Log Analytics workspace.
-During this deployment, you'll authorize a user or group in your Azure AD B2C directory to configure the Log Analytics workspace instance within the tenant that contains your Azure subscription. To create the authorization, you deploy an [Azure Resource Manager](../azure-resource-manager/index.yml) template to the subscription containing the Log Analytics workspace.
+During this deployment, you'll authorize a user or group in your Azure AD B2C directory to configure the Log Analytics workspace instance within the tenant that contains your Azure subscription. To create the authorization, you deploy an [Azure Resource Manager](../azure-resource-manager/index.yml) template to the subscription that contains the Log Analytics workspace.
The following diagram depicts the components you'll configure in your Azure AD and Azure AD B2C tenants. ![Resource group projection](./media/azure-monitor/resource-group-projection.png)
-During this deployment, you'll configure both your Azure AD B2C tenant and Azure AD tenant where the Log Analytics workspace will be hosted. The Azure AD B2C accounts used (such as your admin account) should be assigned the [Global Administrator](../active-directory/roles/permissions-reference.md#global-administrator) role on the Azure AD B2C tenant. The Azure AD account used to run the deployment must be assigned the [Owner](../role-based-access-control/built-in-roles.md#owner) role in the Azure AD subscription. It's also important to make sure you're signed in to the correct directory as you complete each step as described.
+During this deployment, you'll configure your Azure AD B2C tenant where logs are generated. You'll also configure the Azure AD tenant where the Log Analytics workspace will be hosted. The Azure AD B2C accounts used (such as your admin account) should be assigned the [Global Administrator](../active-directory/roles/permissions-reference.md#global-administrator) role on the Azure AD B2C tenant. The Azure AD account you'll use to run the deployment must be assigned the [Owner](../role-based-access-control/built-in-roles.md#owner) role in the Azure AD subscription. It's also important to make sure you're signed in to the correct directory as you complete each step as described.
+
+In summary, you'll use Azure Lighthouse to allow a user or group in your Azure AD B2C tenant to manage a resource group in a subscription associated with a different tenant (the Azure AD tenant). After this authorization is completed, the subscription and Log Analytics workspace can be selected as a target in the Diagnostic settings in Azure AD B2C.
+
+## Prerequisites
+
+- An Azure AD B2C account with the [Global Administrator](../active-directory/roles/permissions-reference.md#global-administrator) role on the Azure AD B2C tenant.
-In summary, you will use Azure Lighthouse to allow a user or group in your Azure AD B2C tenant to manage a resource group in a subscription associated with a different tenant (the Azure AD tenant). After this authorization is completed, the subscription and log analytics workspace can be selected as a target in the Diagnostic settings in Azure AD B2C.
+- An Azure AD account with the [Owner](../role-based-access-control/built-in-roles.md#owner) role in the Azure AD subscription. See how to [Assign a user as an administrator of an Azure subscription](../role-based-access-control/role-assignments-portal-subscription-admin.md).
## 1. Create or choose resource group
To make management easier, we recommend using Azure AD user _groups_ for each ro
### 3.3 Create an Azure Resource Manager template
-To create the custom authorization and delegation in Azure Lighthouse, we use an Azure Resource Manager template that grants Azure AD B2C access to the Azure AD resource group you created earlier (for example, _azure-ad-b2c-monitor_). Deploy the template from the GitHub sample by using the **Deploy to Azure** button, which opens the Azure portal and lets you configure and deploy the template directly in the portal. For these steps, make sure you're signed in to your Azure AD tenant (not the Azure AD B2C tenant).
+To create the custom authorization and delegation in Azure Lighthouse, we use an Azure Resource Manager template. This template grants Azure AD B2C access to the Azure AD resource group that you created earlier (for example, _azure-ad-b2c-monitor_). Deploy the template from the GitHub sample by using the **Deploy to Azure** button, which opens the Azure portal and lets you configure and deploy the template directly in the portal. For these steps, make sure you're signed in to your Azure AD tenant (not the Azure AD B2C tenant).
1. Sign in to the [Azure portal](https://portal.azure.com). 1. Make sure you're using the directory that contains your Azure AD tenant. Select the **Directories + subscriptions** icon in the portal toolbar.
| | - |
| Subscription | Select the directory that contains the Azure subscription where the _azure-ad-b2c-monitor_ resource group was created. |
| Region | Select the region where the resource will be deployed. |
- | Msp Offer Name | A name describing this definition. For example, _Azure AD B2C Monitoring_. This is the name that will be displayed in Azure Lighthouse. The **MSP Offer Name** must be unique in your Azure AD. To monitor multiple Azure AD B2C tenants, use different names. |
+ | Msp Offer Name | A name describing this definition. For example, _Azure AD B2C Monitoring_. It's the name that will be displayed in Azure Lighthouse. The **MSP Offer Name** must be unique in your Azure AD. To monitor multiple Azure AD B2C tenants, use different names. |
| Msp Offer Description | A brief description of your offer. For example, _Enables Azure Monitor in Azure AD B2C_. |
| Managed By Tenant Id | The **Tenant ID** of your Azure AD B2C tenant (also known as the directory ID). |
| Authorizations | Specify a JSON array of objects that include the Azure AD `principalId`, `principalIdDisplayName`, and Azure `roleDefinitionId`. The `principalId` is the **Object ID** of the B2C group or user that will have access to resources in this Azure subscription. For this walkthrough, specify the group's Object ID that you recorded earlier. For the `roleDefinitionId`, use the [built-in role](../role-based-access-control/built-in-roles.md) value for the _Contributor role_, `b24988ac-6180-42a0-ab88-20f7382dd24c`. |
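For the **Authorizations** parameter, a sample array might look like the following. The `principalId` and `principalIdDisplayName` values are placeholders for your own group's Object ID and display name; the `roleDefinitionId` is the Contributor role ID from the row above.

```json
[
  {
    "principalId": "<object-id-of-your-b2c-group>",
    "principalIdDisplayName": "Azure AD B2C monitoring group",
    "roleDefinitionId": "b24988ac-6180-42a0-ab88-20f7382dd24c"
  }
]
```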
After you've deployed the template and waited a few minutes for the resource pro
> [!NOTE] > On the **Portal settings | Directories + subscriptions** page, ensure that your Azure AD B2C and Azure AD tenants are selected under **Current + delegated directories**.
-1. Sign out of the Azure portal if you're currently signed in (this allows your session credentials to be refreshed in the next step).
-1. Sign in to the [Azure portal](https://portal.azure.com) with your **Azure AD B2C** administrative account. This account must be a member of the security group you specified in the [Delegate resource management](#3-delegate-resource-management) step.
+1. Sign out of the [Azure portal](https://portal.azure.com) and sign back in with your **Azure AD B2C** administrative account. This account must be a member of the security group you specified in the [Delegate resource management](#3-delegate-resource-management) step. Signing out and signing back in allows your session credentials to be refreshed in the next step.
1. Select the **Directories + subscriptions** icon in the portal toolbar. 1. On the **Portal settings | Directories + subscriptions** page, in the **Directory name** list, find your Azure AD directory that contains the Azure subscription and the _azure-ad-b2c-monitor_ resource group you created, and then select **Switch**. 1. Verify that you've selected the correct directory and your Azure subscription is listed and selected in the **Default subscription filter**.
To configure monitoring settings for Azure AD B2C activity logs:
> [!NOTE] > It can take up to 15 minutes after an event is emitted for it to [appear in a Log Analytics workspace](../azure-monitor/logs/data-ingestion-time.md). Also, learn more about [Active Directory reporting latencies](../active-directory/reports-monitoring/reference-reports-latencies.md), which can impact the staleness of data and play an important role in reporting.
-If you see the error message "To set up Diagnostic settings to use Azure Monitor for your Azure AD B2C directory, you need to set up delegated resource management," make sure you sign in with a user who is a member of the [security group](#32-select-a-security-group) and [select your subscription](#4-select-your-subscription).
+If you see the error message, _To set up Diagnostic settings to use Azure Monitor for your Azure AD B2C directory, you need to set up delegated resource management_, make sure you sign in with a user who is a member of the [security group](#32-select-a-security-group) and [select your subscription](#4-select-your-subscription).
## 6. Visualize your data
Now you can configure your Log Analytics workspace to visualize your data and co
### 6.1 Create a Query
-Log queries help you to fully leverage the value of the data collected in Azure Monitor Logs. A powerful query language allows you to join data from multiple tables, aggregate large sets of data, and perform complex operations with minimal code. Virtually any question can be answered and analysis performed as long as the supporting data has been collected, and you understand how to construct the right query. For more information, see [Get started with log queries in Azure Monitor](../azure-monitor/logs/get-started-queries.md).
+Log queries help you make full use of the data collected in Azure Monitor Logs. A powerful query language allows you to join data from multiple tables, aggregate large sets of data, and perform complex operations with minimal code. Virtually any question can be answered and analysis performed as long as the supporting data has been collected and you understand how to construct the right query. For more information, see [Get started with log queries in Azure Monitor](../azure-monitor/logs/get-started-queries.md).
1. From **Log Analytics workspace**, select **Logs** 1. In the query editor, paste the following [Kusto Query Language](/azure/data-explorer/kusto/query/) query. This query shows policy usage by operation over the past x days. The default duration is set to 90 days (90d). Notice that the query is focused only on the operation where a token/code is issued by policy.
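A minimal sketch of such a query is shown here, assuming Azure AD B2C audit events are flowing into the `AuditLogs` table of the workspace; the operation filter and the position of the policy name in `AdditionalDetails` may differ in your logs.

```kusto
// Policy usage by operation over the past 90 days (90d); adjust the duration as needed.
AuditLogs
| where TimeGenerated > ago(90d)
| where OperationName contains "issue"                    // operations where a token or code is issued by a policy
| extend Policy = tostring(AdditionalDetails[1].value)    // assumed location of the policy name
| summarize Count = count() by Policy, OperationName
| order by Count desc
```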
The workbook will display reports in the form of a dashboard.
## Create alerts
-Alerts are created by alert rules in Azure Monitor and can automatically run saved queries or custom log searches at regular intervals. You can create alerts based on specific performance metrics or when certain events are created, absence of an event, or a number of events are created within a particular time window. For example, alerts can be used to notify you when average number of sign-in exceeds a certain threshold. For more information, see [Create alerts](../azure-monitor/alerts/alerts-log.md).
+Alerts are created by alert rules in Azure Monitor and can automatically run saved queries or custom log searches at regular intervals. You can create alerts based on specific performance metrics or when certain events occur. You can also create alerts on the absence of an event, or when a number of events occur within a particular time window. For example, alerts can be used to notify you when the average number of sign-ins exceeds a certain threshold. For more information, see [Create alerts](../azure-monitor/alerts/alerts-log.md).
-Use the following instructions to create a new Azure Alert, which will send an [email notification](../azure-monitor/alerts/action-groups.md#configure-notifications) whenever there is a 25% drop in the **Total Requests** compare to previous period. Alert will run every 5 minutes and look for the drop in the last hour compared to the hour before that. The alerts are created using Kusto query language.
+Use the following instructions to create a new Azure Alert, which will send an [email notification](../azure-monitor/alerts/action-groups.md#configure-notifications) whenever there's a 25% drop in the **Total Requests** compared to the previous period. The alert will run every 5 minutes and look for the drop in the last hour compared to the hour before it. The alerts are created using Kusto Query Language.
1. From **Log Analytics workspace**, select **Logs**. 1. Create a new **Kusto query** by using the query below.
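A minimal sketch of such a query, assuming sign-in events land in the `SigninLogs` table, could look like this:

```kusto
// Compare request counts for the last hour with the hour before it.
// SigninLogs is an assumed source table; swap in the table that holds your request events.
SigninLogs
| where TimeGenerated > ago(2h)
| summarize
    PreviousHour = countif(TimeGenerated <= ago(1h)),
    LastHour     = countif(TimeGenerated > ago(1h))
| extend PercentDrop = iff(PreviousHour == 0, 0.0, todouble(PreviousHour - LastHour) * 100 / PreviousHour)
| where PercentDrop >= 25
```

When you create the alert rule, set the evaluation frequency to 5 minutes and base the condition on whether this query returns any rows.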
After the alert is created, go to **Log Analytics workspace** and select **Alert
Azure Monitor and Service Health alerts use action groups to notify users that an alert has been triggered. You can include sending a voice call, SMS, or email, or triggering various types of automated actions. Follow the guidance in [Create and manage action groups in the Azure portal](../azure-monitor/alerts/action-groups.md).
-Here is an example of an alert notification email.
+Here's an example of an alert notification email.
![Email notification](./media/azure-monitor/alert-email-notification.png)
active-directory Recover From Deletions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/fundamentals/recover-from-deletions.md
The most frequent scenarios for application deletion are:
-When you delete an application, the application registration by default enters the soft-delete state. To understand the relationship between application registrations and service principals, see [Apps & service principals in Azure AD - Microsoft identity platform](../develop/app-objects-and-service-principals.md).
+When you delete an application, the application registration by default enters the soft-delete state. To understand the relationship between application registrations and service principals, see [Apps & service principals in Azure AD - Microsoft identity platform](/azure/active-directory/develop/app-objects-and-service-principals).
For details on restoring soft deleted Microsoft 365 Groups, see the following do
### Applications
-Applications have two objects, the application registration and the service principle. For more information on the differences between the registration and the service principal, see [Apps & service principals in Azure AD.](/develop/app-objects-and-service-principals.md)
+Applications have two objects: the application registration and the service principal. For more information on the differences between the registration and the service principal, see [Apps & service principals in Azure AD](/azure/active-directory/develop/app-objects-and-service-principals).
To restore an application from the Azure portal, select App registrations, then deleted applications. Select the application registration to restore, and then select Restore app registration.
active-directory Access Reviews Application Preparation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/governance/access-reviews-application-preparation.md
The integration patterns listed above are applicable to third party SaaS applica
Now that you have identified the integration pattern for the application, check that the application as represented in Azure AD is ready for review. 1. In the Azure portal, click **Azure Active Directory**, click **Enterprise Applications**, and check whether your application is on the [list of enterprise applications](../manage-apps/view-applications-portal.md) in your Azure AD tenant.
-1. If the application is not already listed, then check if the application is available the [application gallery](../manage-apps/overview-application-gallery.md) for applications that can be integrated for federated SSO or provisioning. If it is in the gallery, then use the [tutorials](../saas-apps/tutorial-list.md) to configure the application for federation, and if it supports provisioning, also [configure the application](/app-provisioning/configure-automatic-user-provisioning-portal.md) for provisioning.
+1. If the application is not already listed, then check whether the application is available in the [application gallery](../manage-apps/overview-application-gallery.md) for applications that can be integrated for federated SSO or provisioning. If it is in the gallery, then use the [tutorials](../saas-apps/tutorial-list.md) to configure the application for federation, and if it supports provisioning, also [configure the application](/azure/active-directory/app-provisioning/configure-automatic-user-provisioning-portal) for provisioning.
1. Once the application is in the list of enterprise applications in your tenant, select the application from the list. 1. Change to the **Properties** tab. Verify that the **User assignment required?** option is set to **Yes**. If it's set to **No**, all users in your directory, including external identities, can access the application, and you can't review access to the application.
active-directory Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/managed-identities-azure-resources/overview.md
# What are managed identities for Azure resources?
-A common challenge for developers is the management of secrets and credentials used to secure communication between different components making up a solution. Managed identities eliminate the need for developers to manage credentials.
+A common challenge for developers is the management of secrets, credentials, certificates, and keys used to secure communication between services. Managed identities eliminate the need for developers to manage these credentials.
-Managed identities provide an identity for applications to use when connecting to resources that support Azure Active Directory (Azure AD) authentication. Applications may use the managed identity to obtain Azure AD tokens. With [Azure Key Vault](../../key-vault/general/overview.md), developers can use managed identities to access resources. Key Vault stores credentials in a secure manner and gives access to storage accounts.
+While developers can securely store the secrets in [Azure Key Vault](../../key-vault/general/overview.md), services need a way to access Azure Key Vault. Managed identities provide an automatically managed identity in Azure Active Directory for applications to use when connecting to resources that support Azure Active Directory (Azure AD) authentication. Applications can use managed identities to obtain Azure AD tokens without having to manage any credentials.
The following video shows how you can use managed identities:</br>
The following table shows the differences between the two types of managed ident
## How can I use managed identities for Azure resources?
-[![This flowchart shows examples of how a developer may use managed identities to get access to resources from their code without managing authentication information.](media/overview/when-use-managed-identities.png)](media/overview/when-use-managed-identities.png#lightbox)
+To use managed identities, do the following:
+1. Create a managed identity in Azure. You can choose between a system-assigned managed identity and a user-assigned managed identity.
+2. For a user-assigned managed identity, assign the managed identity to the "source" Azure resource, such as an Azure Logic App or an Azure Web App.
+3. Authorize the managed identity to have access to the "target" service.
+4. Use the managed identity to access the "target" service. To do this, you can use the Azure SDK with the Azure.Identity library. Some "source" resources offer connectors that know how to use managed identities for the connections. In that case, you use the identity as a feature of that "source" resource (see the sketch after this list).
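As a minimal sketch of step 4, assuming a Python app running on an Azure resource that has a managed identity and uses the `azure-identity` and `azure-keyvault-secrets` packages (the vault URL, secret name, and client ID are placeholders):

```python
# Sketch only: read a Key Vault secret using the resource's managed identity.
# Assumes the identity was granted access to the vault in step 3.
from azure.identity import ManagedIdentityCredential
from azure.keyvault.secrets import SecretClient

# For a user-assigned identity, pass its client ID; omit it for a system-assigned identity.
credential = ManagedIdentityCredential(client_id="<user-assigned-identity-client-id>")

client = SecretClient(vault_url="https://<your-vault-name>.vault.azure.net", credential=credential)
secret = client.get_secret("<secret-name>")
print(secret.value)
```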
+ ## What Azure services support the feature?<a name="which-azure-services-support-managed-identity"></a>
aks Dapr https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/dapr.md
az k8s-extension delete --resource-group myResourceGroup --cluster-name myAKSClu
[az-provider-register]: /cli/azure/provider#az-provider-register [sample-application]: ./quickstart-dapr.md [k8s-version-support-policy]: ./supported-kubernetes-versions.md?tabs=azure-cli#kubernetes-version-support-policy
-[arc-k8s-cluster]: ../azure-arc/kubernetes/quickstart-connect-cluster.md
+[arc-k8s-cluster]: /azure/azure-arc/kubernetes/quickstart-connect-cluster
[update-extension]: ./cluster-extensions.md#update-extension-instance [install-cli]: /cli/azure/install-azure-cli
az k8s-extension delete --resource-group myResourceGroup --cluster-name myAKSClu
[dapr-oss-support]: https://docs.dapr.io/operations/support/support-release-policy/ [dapr-supported-version]: https://docs.dapr.io/operations/support/support-release-policy/#supported-versions [dapr-troubleshooting]: https://docs.dapr.io/operations/troubleshooting/common_issues/
-[supported-cloud-regions]: https://azure.microsoft.com/global-infrastructure/services/?products=azure-arc
+[supported-cloud-regions]: https://azure.microsoft.com/global-infrastructure/services/?products=azure-arc
app-service Quickstart Python Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/quickstart-python-portal.md
Having issues? [Let us know](https://aka.ms/FlaskCLIQuickstartHelp).
## Next steps > [!div class="nextstepaction"]
-> [Tutorial: Python (Django) web app with PostgreSQL](/azure/developer/python/tutorial-python-postgresql-app-portal)
+> [Tutorial: Python (Django) web app with PostgreSQL](/azure/app-service/tutorial-python-postgresql-app)
> [!div class="nextstepaction"] > [Configure Python app](configure-language-python.md)
app-service Troubleshoot Dotnet Visual Studio https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/troubleshoot-dotnet-visual-studio.md
The tutorial assumes you're using Visual Studio 2019.
The streaming logs feature only works for applications that target .NET Framework 4 or later. ## <a name="sitemanagement"></a>App configuration and management
-Visual Studio provides access to a subset of the app management functions and configuration settings available in the [Azure portal](https://go.microsoft.com/fwlink/?LinkId=529715). In this section, you'll see what's available by using **Server Explorer**. To see the latest Azure integration features, try out **Cloud Explorer** also. You can open both windows from the **View** menu.
+Visual Studio provides access to a subset of the app management functions and configuration settings available in the [Azure portal](/rest/api/appservice/web-apps). In this section, you'll see what's available by using **Server Explorer**. To see the latest Azure integration features, try out **Cloud Explorer** also. You can open both windows from the **View** menu.
1. If you aren't already signed in to Azure in Visual Studio, right-click **Azure** and select Connect to **Microsoft Azure Subscription** in **Server Explorer**.
For more information about analyzing web server logs, see the following resource
The Microsoft TechNet website includes a [Using Failed Request Tracing](https://www.iis.net/learn/troubleshoot/using-failed-request-tracing) section, which may be helpful for understanding how to use these logs. However, this documentation focuses mainly on configuring failed request tracing in IIS, which you can't do in Azure App Service. [GetStarted]: quickstart-dotnetcore.md?pivots=platform-windows
-[GetStartedWJ]: https://github.com/Azure/azure-webjobs-sdk/wiki
+[GetStartedWJ]: https://github.com/Azure/azure-webjobs-sdk/wiki
azure-australia Azure Key Vault https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-australia/azure-key-vault.md
Access to Key Vaults should be explicitly restricted to the minimum set of netwo
Key Vault supports BYOK. BYOK enables users to import keys from their existing key infrastructures. The BYOK toolset supports the secure transfer and import of keys from an external HSM (for example, keys generated with an offline workstation) into Key Vault.
-Go to the Microsoft Download Center and [download the Azure Key Vault BYOK toolset](https://www.microsoft.com/download/details.aspx?id=45345) for Australia. The package name to download and its corresponding SHA-256 package hash are:
-
-|Package Name|SHA-256 Hash|
-|||
-|KeyVault-BYOK-Tools-Australia.zip|CD0FB7365053DEF8C35116D7C92D203C64A3D3EE2452A025223EEB166901C40A|
-|
- ### Key Vault auditing and logging The ACSC requires Commonwealth entities to use the appropriate Azure services to undertake real-time monitoring and reporting on their Azure workloads.
SSE is used for managed disks but customer-managed keys are not supported. Encr
Review the article on [Identity Federation](identity-federation.md)
-Review additional Azure Key Vault documentation and tutorials in the [Reference Library](reference-library.md)
+Review additional Azure Key Vault documentation and tutorials in the [Reference Library](reference-library.md)
azure-australia Gateway Egress Traffic https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-australia/gateway-egress-traffic.md
The following key requirements for controlling egress traffic in Azure have been
Description|Source |
-**Implement Network Segmentation and Segregation**, for example, use an n-tier architecture, using host-based firewalls and network access controls to limit inbound and outbound network connectivity to only required ports and protocols.| [Cloud Computing for Tenants](https://acsc.gov.au/publications/protect/cloud-security-tenants.htm)
-**Implement adequately high bandwidth, low latency, reliable network connectivity** between the tenant (including the tenant's remote users) and the cloud service to meet the tenant's availability requirements | [ACSC Protect: Implementing Network Segmentation and Segregation](https://acsc.gov.au/publications/protect/network_segmentation_segregation.htm)
-**Apply technologies at more than just the network layer**. Each host and network should be segmented and segregated, where possible, at the lowest level that can be practically managed. In most cases, this applies from the data link layer up to and including the application layer; however, in sensitive environments, physical isolation may be appropriate. Host-based and network-wide measures should be deployed in a complementary manner and be centrally monitored. Just implementing a firewall or security appliance as the only security measure is not sufficient. |[ACSC Protect: Implementing Network Segmentation and Segregation](https://acsc.gov.au/publications/protect/network_segmentation_segregation.htm)
-**Use the principles of least privilege and needΓÇÉtoΓÇÉknow**. If a host, service, or network doesn't need to communicate with another host, service, or network, it should not be allowed to. If a host, service, or network only needs to talk to another host, service, or network on a specific port or protocol, it should be restricted to only those ports and protocols. Adopting these principles across a network will complement the minimisation of user privileges and significantly increase the overall security posture of the environment. |[ACSC Protect: Implementing Network Segmentation and Segregation](https://acsc.gov.au/publications/protect/network_segmentation_segregation.htm)
-**Separate hosts and networks based on their sensitivity or criticality to business operations**. This may include using different hardware or platforms depending on different security classifications, security domains, or availability/integrity requirements for certain hosts or networks. In particular, separate management networks and consider physically isolating out-of-band management networks for sensitive environments. |[ACSC Protect: Implementing Network Segmentation and Segregation](https://acsc.gov.au/publications/protect/network_segmentation_segregation.htm)
-**Identify, authenticate, and authorise access by all entities to all other entities**. All users, hosts, and services should have their access to all other users, hosts, and services restricted to only those required to perform their designated duties or functions. All legacy or local services which bypass or downgrade the strength of identification, authentication, and authorisation services should be disabled wherever possible and have their use closely monitored. |[ACSC Protect: Implementing Network Segmentation and Segregation](https://acsc.gov.au/publications/protect/network_segmentation_segregation.htm)
-**Implement allowlisting of network traffic instead of deny listing**. Only permit access for known good network traffic (traffic that is identified, authenticated, and authorised), rather than denying access to known bad network traffic (for example, blocking a specific address or service). Allowlists result in a superior security policy to deny lists, and significantly improve your capacity to detect and assess potential network intrusions. |[ACSC Protect: Implementing Network Segmentation and Segregation](https://acsc.gov.au/publications/protect/network_segmentation_segregation.htm)
+**Implement Network Segmentation and Segregation**, for example, use an n-tier architecture, using host-based firewalls and network access controls to limit inbound and outbound network connectivity to only required ports and protocols.| [Cloud Computing for Tenants](https://www.cyber.gov.au/acsc/view-all-content/publications/cloud-computing-security-tenants)
+**Implement adequately high bandwidth, low latency, reliable network connectivity** between the tenant (including the tenant's remote users) and the cloud service to meet the tenant's availability requirements | [ACSC Protect: Implementing Network Segmentation and Segregation](https://www.cyber.gov.au/acsc/view-all-content/publications/implementing-network-segmentation-and-segregation)
+**Apply technologies at more than just the network layer**. Each host and network should be segmented and segregated, where possible, at the lowest level that can be practically managed. In most cases, this applies from the data link layer up to and including the application layer; however, in sensitive environments, physical isolation may be appropriate. Host-based and network-wide measures should be deployed in a complementary manner and be centrally monitored. Just implementing a firewall or security appliance as the only security measure is not sufficient. |[ACSC Protect: Implementing Network Segmentation and Segregation](https://www.cyber.gov.au/acsc/view-all-content/publications/implementing-network-segmentation-and-segregation)
+**Use the principles of least privilege and need-to-know**. If a host, service, or network doesn't need to communicate with another host, service, or network, it should not be allowed to. If a host, service, or network only needs to talk to another host, service, or network on a specific port or protocol, it should be restricted to only those ports and protocols. Adopting these principles across a network will complement the minimisation of user privileges and significantly increase the overall security posture of the environment. |[ACSC Protect: Implementing Network Segmentation and Segregation](https://www.cyber.gov.au/acsc/view-all-content/publications/implementing-network-segmentation-and-segregation)
+**Separate hosts and networks based on their sensitivity or criticality to business operations**. This may include using different hardware or platforms depending on different security classifications, security domains, or availability/integrity requirements for certain hosts or networks. In particular, separate management networks and consider physically isolating out-of-band management networks for sensitive environments. |[ACSC Protect: Implementing Network Segmentation and Segregation](https://www.cyber.gov.au/acsc/view-all-content/publications/implementing-network-segmentation-and-segregation)
+**Identify, authenticate, and authorise access by all entities to all other entities**. All users, hosts, and services should have their access to all other users, hosts, and services restricted to only those required to perform their designated duties or functions. All legacy or local services which bypass or downgrade the strength of identification, authentication, and authorisation services should be disabled wherever possible and have their use closely monitored. |[ACSC Protect: Implementing Network Segmentation and Segregation](https://www.cyber.gov.au/acsc/view-all-content/publications/implementing-network-segmentation-and-segregation)
+**Implement allowlisting of network traffic instead of deny listing**. Only permit access for known good network traffic (traffic that is identified, authenticated, and authorised), rather than denying access to known bad network traffic (for example, blocking a specific address or service). Allowlists result in a superior security policy to deny lists, and significantly improve your capacity to detect and assess potential network intrusions. |[ACSC Protect: Implementing Network Segmentation and Segregation](https://www.cyber.gov.au/acsc/view-all-content/publications/implementing-network-segmentation-and-segregation)
**Defining an allowlist of permitted websites and blocking all unlisted websites** effectively removes one of the most common data delivery and exfiltration techniques used by an adversary. If users have a legitimate requirement to access numerous websites, or a rapidly changing list of websites; you should consider the costs of such an implementation. Even a relatively permissive allowlist offers better security than relying on deny lists, or no restrictions at all, while still reducing implementation costs. An example of a permissive allowlist could be permitting the entire Australian subdomain, that is '*.au', or allowing the top 1,000 sites from the Alexa site ranking (after filtering Dynamic Domain Name System (DDNS) domains and other inappropriate domains).| [Australian Government Information Security Manual (ISM)](https://www.cyber.gov.au/ism) |
PaaS resources deployed into a virtual network receive dedicated IP addresses an
## General guidance
-To design and build secure solutions within Azure, it is critical to understand and control the network traffic so that only identified and authorised communication can occur. The intent of this guidance and the specific component guidance in later sections is to describe the tools and services that can be utilised to apply the principles outlined in the [ACSC Protect: Implementing Network Segmentation and Segregation](https://acsc.gov.au/publications/protect/network_segmentation_segregation.htm) across Azure workloads. This includes detailing how to create a virtual architecture for securing resources when it is not possible to apply the same traditional physical and network controls that are possible in an on-premises environment.
+To design and build secure solutions within Azure, it is critical to understand and control the network traffic so that only identified and authorised communication can occur. The intent of this guidance and the specific component guidance in later sections is to describe the tools and services that can be utilised to apply the principles outlined in the [ACSC Protect: Implementing Network Segmentation and Segregation](https://www.cyber.gov.au/acsc/view-all-content/publications/implementing-network-segmentation-and-segregation) across Azure workloads. This includes detailing how to create a virtual architecture for securing resources when it is not possible to apply the same traditional physical and network controls that are possible in an on-premises environment.
### Guidance
Item | Link
--| _Australian Regulatory and Policy Compliance Documents including Consumer Guidance_ | [https://aka.ms/au-irap](https://aka.ms/au-irap) _Azure Virtual Data Centre_ | [https://docs.microsoft.com/azure/architecture/vdc/networking-virtual-datacenter](/azure/architecture/vdc/networking-virtual-datacenter)
-_ACSC Network Segmentation_ | [https://acsc.gov.au/publications/protect/network_segmentation_segregation.htm](https://acsc.gov.au/publications/protect/network_segmentation_segregation.htm)
-_ACSC Cloud Security for Tenants_ | [https://acsc.gov.au/publications/protect/cloud-security-tenants.htm](https://acsc.gov.au/publications/protect/cloud-security-tenants.htm)
+_ACSC Network Segmentation_ | [https://www.cyber.gov.au/acsc/view-all-content/publications/implementing-network-segmentation-and-segregation](https://www.cyber.gov.au/acsc/view-all-content/publications/implementing-network-segmentation-and-segregation)
+_ACSC Cloud Security for Tenants_ | [https://www.cyber.gov.au/acsc/view-all-content/publications/cloud-computing-security-tenants](https://www.cyber.gov.au/acsc/view-all-content/publications/cloud-computing-security-tenants)
_ACSC Information Security Manual_ | [https://acsc.gov.au/infosec/ism/index.htm](https://acsc.gov.au/infosec/ism/index.htm) |
Public IP addresses for PaaS capabilities are allocated based on the region wher
## Next steps
-Compare your overall architecture and design to the published [PROTECTED Blueprints for IaaS and PaaS Web Applications](https://aka.ms/au-protected).
+Compare your overall architecture and design to the published [PROTECTED Blueprints for IaaS and PaaS Web Applications](https://aka.ms/au-protected).
azure-australia Gateway Ingress Traffic https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-australia/gateway-ingress-traffic.md
The network controls align with the Australian Cyber Security Centre (ACSC) Cons
## Requirements
-The overall security requirements for Commonwealth systems are defined in the ISM. To assist Commonwealth entities in implementing network security, the ACSC has published [ACSC Protect: Implementing Network Segmentation and Segregation](https://www.acsc.gov.au/publications/protect/network_segmentation_segregation.htm), and to assist with securing systems in Cloud environments the ACSC has published [Cloud Computing Security for Tenants](https://www.cyber.gov.au/publications/cloud-computing-security-for-tenants).
+The overall security requirements for Commonwealth systems are defined in the ISM. To assist Commonwealth entities in implementing network security, the ACSC has published [ACSC Protect: Implementing Network Segmentation and Segregation](https://www.cyber.gov.au/acsc/view-all-content/publications/implementing-network-segmentation-and-segregation), and to assist with securing systems in Cloud environments the ACSC has published [Cloud Computing Security for Tenants](https://www.cyber.gov.au/publications/cloud-computing-security-for-tenants).
These guides outline the context for implementing network security and controlling traffic and include practical recommendations for network design and configuration.
To design and build secure solutions within Azure, it is critical to understand
||| |Australian Regulatory and Policy Compliance Documents including Consumer Guidance|[https://aka.ms/au-irap](https://aka.ms/au-irap)| |Azure Virtual Data Center|[https://docs.microsoft.com/azure/architecture/vdc/networking-virtual-datacenter](/azure/architecture/vdc/networking-virtual-datacenter)|
-|ACSC Network Segmentation|[https://acsc.gov.au/publications/protect/network_segmentation_segregation.htm](https://acsc.gov.au/publications/protect/network_segmentation_segregation.htm)|
-|ACSC Cloud Security for Tenants| [https://acsc.gov.au/publications/protect/cloud-security-tenants.htm](https://acsc.gov.au/publications/protect/cloud-security-tenants.htm)|
+|ACSC Network Segmentation|[https://www.cyber.gov.au/acsc/view-all-content/publications/implementing-network-segmentation-and-segregation](https://www.cyber.gov.au/acsc/view-all-content/publications/implementing-network-segmentation-and-segregation)|
+|ACSC Cloud Security for Tenants| [https://www.cyber.gov.au/acsc/view-all-content/publications/cloud-computing-security-tenants](https://www.cyber.gov.au/acsc/view-all-content/publications/cloud-computing-security-tenants)|
|ACSC Information Security Manual|[https://acsc.gov.au/infosec/ism/index.htm](https://acsc.gov.au/infosec/ism/index.htm)| ## Component guidance
Azure Policy is a key component for enforcing and maintaining the integrity of t
## Next steps
-Review the article on [Gateway Egress Traffic Management and Control](gateway-egress-traffic.md) for details on managing traffic flows from your Azure environment to other networks using your Gateway components in Azure.
+Review the article on [Gateway Egress Traffic Management and Control](gateway-egress-traffic.md) for details on managing traffic flows from your Azure environment to other networks using your Gateway components in Azure.
azure-australia Gateway Secure Remote Administration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-australia/gateway-secure-remote-administration.md
This article provides information on implementing a secure remote administration
## Australian Cyber Security Centre (ACSC) requirements
-The overall security requirements for Commonwealth systems are defined in the ISM. To assist Commonwealth entities in providing secure administration, the ACSC has published [ACSC Protect: Secure Administration](https://www.acsc.gov.au/publications/protect/secure-administration.htm)
+The overall security requirements for Commonwealth systems are defined in the ISM. To assist Commonwealth entities in providing secure administration, the ACSC has published [ACSC Protect: Secure Administration](https://www.cyber.gov.au/acsc/view-all-content/publications/secure-administration)
This document discusses the importance of secure administration and suggests one method of implementing a secure administration environment. The document describes the elements of a secure administration solution as follows:
This article provides a reference architecture for how the elements above can be
## Architecture
-Providing a secure administration capability requires multiple components that all work together to form a cohesive solution. In the reference architecture provided, the components are mapped to the elements described in [ACSC Protect: Secure Administration](https://www.acsc.gov.au/publications/protect/secure-administration.htm)
+Providing a secure administration capability requires multiple components that all work together to form a cohesive solution. In the reference architecture provided, the components are mapped to the elements described in [ACSC Protect: Secure Administration](https://www.cyber.gov.au/acsc/view-all-content/publications/secure-administration)
![Azure Secure Remote Administration Architecture](media/remote-admin.png)
When implementing the components listed in this article, the following general g
|Australian Regulatory and Policy Compliance Documents|[Australian Regulatory and Policy Compliance Documents](https://aka.ms/au-irap)| |Azure products - Australian regions and non-regional|[Azure products - Australian regions and non-regional](https://azure.microsoft.com/global-infrastructure/services/?regions=non-regional,australia-central,australia-central-2,australia-east,australia-southeast)| |Strategies to Mitigate Cyber Security Incidents|[Strategies to Mitigate Cyber Security Incidents](https://acsc.gov.au/infosec/mitigationstrategies.htm)|
-|ACSC Protect: Secure Administration|[ACSC Protect: Secure Administration](https://acsc.gov.au/publications/protect/secure-administration.htm)|
+|ACSC Protect: Secure Administration|[ACSC Protect: Secure Administration](https://www.cyber.gov.au/acsc/view-all-content/publications/secure-administration)|
|How To: Integrate your Remote Desktop Gateway infrastructure using the Network Policy Server (NPS) extension and Azure AD|[Integrate RD Gateway with NPS and Azure AD](../active-directory/authentication/howto-mfa-nps-extension-rdg.md)| ## Component guidance
azure-fluid-relay Connect Fluid Azure Service https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-fluid-relay/how-tos/connect-fluid-azure-service.md
Now that you have an instance of `AzureClient`, you can start using it to create
### Token providers
-The [AzureFunctionTokenProvider](https://github.com/microsoft/FluidFramework/blob/main/packages/framework/azure-client/src/AzureFunctionTokenProvider.ts) is an implementation of `ITokenProvider` which ensures your tenant key secret is not exposed in your client-side bundle code. The `AzureFunctionTokenProvider` takes in your Azure Function URL appended by `/api/GetAzureToken` along with the current user object. Later on, it makes a `GET` request to your Azure Function by passing in the tenantId, documentId and userId/userName as optional parameters.
+The [AzureFunctionTokenProvider](https://github.com/microsoft/FluidFramework/blob/main/azure/packages/azure-client/src/AzureFunctionTokenProvider.ts) is an implementation of `ITokenProvider` which ensures your tenant key secret is not exposed in your client-side bundle code. The `AzureFunctionTokenProvider` takes in your Azure Function URL appended by `/api/GetAzureToken` along with the current user object. Later on, it makes a `GET` request to your Azure Function by passing in the tenantId, documentId and userId/userName as optional parameters.
```javascript const config = {
azure-percept Overview Ai Models https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-percept/overview-ai-models.md
With pre-trained models, no coding or training data collection is required. Simp
## Reference solutions
-A [people counting reference solution](https://github.com/microsoft/Azure-Percept-Reference-Solutions/tree/main/people-detection-app) is also available. This reference solution is an open-source AI application providing edge-based people counting with user-defined zone entry/exit events. Video and AI output from the on-premise edge device is egressed to [Azure Data Lake](https://azure.microsoft.com/solutions/data-lake/), with the user interface running as an Azure website. AI inferencing is provided by an open-source AI model for people detection.
+A people counting reference solution is also available. This reference solution is an open-source AI application providing edge-based people counting with user-defined zone entry/exit events. Video and AI output from the on-premise edge device is egressed to [Azure Data Lake](https://azure.microsoft.com/solutions/data-lake/), with the user interface running as an Azure website. AI inferencing is provided by an open-source AI model for people detection.
:::image type="content" source="./media/overview-ai-models/people-detector.gif" alt-text="Spatial analytics pre-built solution gif.":::
azure-percept Voice Control Your Inventory Then Visualize With Power Bi Dashboard https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-percept/voice-control-your-inventory-then-visualize-with-power-bi-dashboard.md
In this tutorial, you learn how to:
- The [Azure Functions Core Tools](https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/azure-functions/functions-run-local.md) version 3.x. - The [Python extension](https://marketplace.visualstudio.com/items?itemName=ms-python.python) for Visual Studio Code. - The [Azure Functions extension](https://marketplace.visualstudio.com/items?itemName=ms-azuretools.vscode-azurefunctions) for Visual Studio Code.-- Create an [Azure SQL server](https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/azure-sql/database/single-database-create-quickstart.md)
+- Create an [Azure SQL server](/azure/azure-sql/database/single-database-create-quickstart)
## Software architecture
cloud-services Cloud Services Dotnet Get Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cloud-services/cloud-services-dotnet-get-started.md
The application is an advertising bulletin board. Users create an ad by entering
The application uses the [queue-centric work pattern](https://www.asp.net/aspnet/overview/developing-apps-with-windows-azure/building-real-world-cloud-apps-with-windows-azure/queue-centric-work-pattern) to off-load the CPU-intensive work of creating thumbnails to a back-end process. ## Alternative architecture: App Service and WebJobs
-This tutorial shows how to run both front-end and back-end in an Azure cloud service. An alternative is to run the front-end in [Azure App Service](../app-service/index.yml) and use the [WebJobs](https://go.microsoft.com/fwlink/?LinkId=390226) feature for the back-end. For a tutorial that uses WebJobs, see [Get Started with the Azure WebJobs SDK](https://github.com/Azure/azure-webjobs-sdk/wiki). For information about how to choose the services that best fit your scenario, see [Azure App Service, Cloud Services, and virtual machines comparison](/azure/architecture/guide/technology-choices/compute-decision-tree).
+This tutorial shows how to run both front-end and back-end in an Azure cloud service. An alternative is to run the front-end in [Azure App Service](../app-service/index.yml) and use the [WebJobs](/azure/app-service/webjobs-create) feature for the back-end. For a tutorial that uses WebJobs, see [Get Started with the Azure WebJobs SDK](https://github.com/Azure/azure-webjobs-sdk/wiki). For information about how to choose the services that best fit your scenario, see [Azure App Service, Cloud Services, and virtual machines comparison](/azure/architecture/guide/technology-choices/compute-decision-tree).
## What you'll learn * How to enable your machine for Azure development by installing the Azure SDK.
cloud-services Cloud Services Nodejs Chat App Socketio https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cloud-services/cloud-services-nodejs-chat-app-socketio.md
Azure emulator:
> [!IMPORTANT] > Be sure to use a unique name, otherwise the publish process will fail. After the deployment has completed, the browser will open and navigate to the deployed service. >
- > If you receive an error stating that the provided subscription name doesn't exist in the imported publish profile, you must download and import the publishing profile for your subscription before deploying to Azure. See the **Deploying the Application to Azure** section of [Build and deploy a Node.js application to an Azure Cloud Service](https://azure.microsoft.com/develop/nodejs/tutorials/getting-started/)
+ > If you receive an error stating that the provided subscription name doesn't exist in the imported publish profile, you must download and import the publishing profile for your subscription before deploying to Azure. See the **Deploying the Application to Azure** section of [Build and deploy a Node.js application to an Azure Cloud Service](/azure/cloud-services/cloud-services-nodejs-develop-deploy-app)
> > ![A browser window displaying the service hosted on Azure][completed-app] > [!NOTE]
- > If you receive an error stating that the provided subscription name doesn't exist in the imported publish profile, you must download and import the publishing profile for your subscription before deploying to Azure. See the **Deploying the Application to Azure** section of [Build and deploy a Node.js application to an Azure Cloud Service](https://azure.microsoft.com/develop/nodejs/tutorials/getting-started/)
+ > If you receive an error stating that the provided subscription name doesn't exist in the imported publish profile, you must download and import the publishing profile for your subscription before deploying to Azure. See the **Deploying the Application to Azure** section of [Build and deploy a Node.js application to an Azure Cloud Service](/azure/cloud-services/cloud-services-nodejs-develop-deploy-app)
> >
cloud-services Cloud Services Python How To Use Service Management https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cloud-services/cloud-services-python-how-to-use-service-management.md
To use the Service Management API, you need to [create an Azure account](https:/
The Azure SDK for Python wraps the [Service Management API][svc-mgmt-rest-api], which is a REST API. All API operations are performed over TLS and mutually authenticated by using X.509 v3 certificates. The management service can be accessed from within a service running in Azure. It also can be accessed directly over the Internet from any application that can send an HTTPS request and receive an HTTPS response. ## <a name="Installation"> </a>Installation
-All the features described in this article are available in the `azure-servicemanagement-legacy` package, which you can install by using pip. For more information about installation (for example, if you're new to Python), see [Install Python and the Azure SDK](/azure/developer/python/azure-sdk-install).
+All the features described in this article are available in the `azure-servicemanagement-legacy` package, which you can install by using pip. For more information about installation (for example, if you're new to Python), see [Install Python and the Azure SDK](/azure/developer/python/sdk/azure-sdk-install).
## <a name="Connect"> </a>Connect to service management

To connect to the service management endpoint, you need your Azure subscription ID and a valid management certificate. You can obtain your subscription ID through the [Azure portal][management-portal].
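As a minimal sketch of this connection step (assuming the `azure-servicemanagement-legacy` package is installed via pip, and using placeholder values for the subscription ID and certificate path), the snippet below creates a client and lists the cloud services in the subscription:

```python
# Assumes: pip install azure-servicemanagement-legacy
from azure.servicemanagement import ServiceManagementService

subscription_id = "<your-subscription-id>"         # placeholder; copy from the Azure portal
certificate_path = "/path/to/management-cert.pem"  # placeholder; local management certificate (.pem)

# All calls go over TLS and are authenticated with the X.509 management certificate.
sms = ServiceManagementService(subscription_id, certificate_path)

# Example call: list the cloud services (hosted services) in the subscription.
for hosted_service in sms.list_hosted_services():
    print(hosted_service.service_name)
```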
For more information, see the [Python Developer Center](https://azure.microsoft.
[svc-mgmt-rest-api]: /previous-versions/azure/ee460799(v=azure.100)
-[cloud service]:/azure/cloud-services/
+[cloud service]:/azure/cloud-services/
container-instances Container Instances Using Azure Container Registry https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-instances/container-instances-using-azure-container-registry.md
## Limitations
-* The [Azure Container Registry](../container-registry/container-registry-vnet.md) must have [Public Access set to 'All Networks'](../container-registry/container-registry-access-selected-networks.md). To use an Azure container registry with Public Access set to 'Select Networks' or 'None', visit [ACI's article for using Managed-Identity based authentication with ACR](/using-azure-container-registry-mi.md).
+* The [Azure Container Registry](../container-registry/container-registry-vnet.md) must have [Public Access set to 'All Networks'](../container-registry/container-registry-access-selected-networks.md). To use an Azure container registry with Public Access set to 'Select Networks' or 'None', visit [ACI's article for using Managed-Identity based authentication with ACR](/azure/container-registry/container-registry-authentication-managed-identity).
## Configure registry authentication
cosmos-db Kafka Connect https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/cassandra/kafka-connect.md
Title: Integrate Apache Kafka and Azure Cosmos DB Cassandra API using Kafka Connect description: Learn how to ingest data from Kafka to Azure Cosmos DB Cassandra API using DataStax Apache Kafka Connector-+ Last updated 12/14/2020-+
cosmos-db Manage Data Go https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/cassandra/manage-data-go.md
Title: Build a Go app with Azure Cosmos DB Cassandra API using the gocql client description: This quickstart shows how to use a Go client to interact with Azure Cosmos DB Cassandra API --+++ ms.devlang: golang
cosmos-db Postgres Migrate Cosmos Db Kafka https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/cassandra/postgres-migrate-cosmos-db-kafka.md
Title: Migrate data from PostgreSQL to Azure Cosmos DB Cassandra API account using Apache Kafka description: Learn how to use Kafka Connect to synchronize data from PostgreSQL to Azure Cosmos DB Cassandra API in real time.-+ Last updated 04/02/2022-+
cosmos-db Change Feed https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/change-feed.md
Title: Working with the change feed support in Azure Cosmos DB description: Use Azure Cosmos DB change feed support to track changes in documents, event-based processing like triggers, and keep caches and analytic systems up-to-date --+++ Last updated 06/07/2021- # Change feed in Azure Cosmos DB
cosmos-db Convert Vcore To Request Unit https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/convert-vcore-to-request-unit.md
Title: 'Convert the number of vCores or vCPUs in your nonrelational database to Azure Cosmos DB RU/s' description: 'Convert the number of vCores or vCPUs in your nonrelational database to Azure Cosmos DB RU/s'--+++
cosmos-db Cosmos Db Reserved Capacity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/cosmos-db-reserved-capacity.md
Title: Reserved capacity in Azure Cosmos DB to Optimize cost description: Learn how to buy Azure Cosmos DB reserved capacity to save on your compute costs.-+ Last updated 08/26/2021--++ # Optimize cost with reserved capacity in Azure Cosmos DB
cosmos-db Cosmosdb Migrationchoices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/cosmosdb-migrationchoices.md
Title: Cosmos DB Migration options description: This doc describes the various options to migrate your on-premises or cloud data to Azure Cosmos DB--+++ Last updated 04/02/2022
cosmos-db Dedicated Gateway https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/dedicated-gateway.md
Title: Azure Cosmos DB dedicated gateway description: A dedicated gateway is compute that is a front-end to your Azure Cosmos DB account. When you connect to the dedicated gateway, it routes requests and caches data.-+ Last updated 11/08/2021-++ # Azure Cosmos DB dedicated gateway - Overview (Preview)
cosmos-db Distribute Data Globally https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/distribute-data-globally.md
Title: Distribute data globally with Azure Cosmos DB description: Learn about planet-scale geo-replication, multi-region writes, failover, and data recovery using global databases from Azure Cosmos DB, a globally distributed, multi-model database service.--+++ Last updated 01/06/2021
cosmos-db Global Dist Under The Hood https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/global-dist-under-the-hood.md
Title: Global distribution with Azure Cosmos DB- under the hood description: This article provides technical details relating to global distribution of Azure Cosmos DB-+ Last updated 07/02/2020---++ # Global data distribution with Azure Cosmos DB - under the hood
cosmos-db Hierarchical Partition Keys https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/hierarchical-partition-keys.md
Queries that specify either the **TenantId**, or both **TenantId** and **UserId*
## Getting started

> [!IMPORTANT]
-> Working with containers that use hierarchical partition keys is supported only in the preview versions of the .NET v3 and Java v4 SDK. You must use the supported SDK to create new containers with hierarchical partition keys and to perform CRUD/query operations on the data
+> Working with containers that use hierarchical partition keys is supported only in the preview versions of the .NET v3 and Java v4 SDK. You must use the supported SDK to create new containers with hierarchical partition keys and to perform CRUD/query operations on the data.
+> If you would like to use an SDK or connector that isn't currently supported, please file a request on our [community forum](https://feedback.azure.com/d365community/forum/3002b3be-0d25-ec11-b6e6-000d3a4f0858).
Find the latest preview version of each supported SDK:
cosmos-db How To Configure Integrated Cache https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/how-to-configure-integrated-cache.md
Title: How to configure the Azure Cosmos DB integrated cache description: Learn how to configure the Azure Cosmos DB integrated cache-+ Last updated 09/28/2021-++ # How to configure the Azure Cosmos DB integrated cache (Preview)
cosmos-db How To Restrict User Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/how-to-restrict-user-data.md
Title: Restrict user access to data operations only with Azure Cosmos DB description: Learn how to restrict access to data operations only with Azure Cosmos DB-+ Last updated 12/9/2019-++
cosmos-db Index Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/index-overview.md
Title: Indexing in Azure Cosmos DB description: Understand how indexing works in Azure Cosmos DB, different types of indexes such as Range, Spatial, composite indexes supported. -+ Last updated 08/26/2021-++ # Indexing in Azure Cosmos DB - Overview
cosmos-db Index Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/index-policy.md
Title: Azure Cosmos DB indexing policies description: Learn how to configure and change the default indexing policy for automatic indexing and greater performance in Azure Cosmos DB.-+ Last updated 12/07/2021-++ # Indexing policies in Azure Cosmos DB
cosmos-db Integrated Cache Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/integrated-cache-faq.md
Title: Azure Cosmos DB integrated cache frequently asked questions description: Frequently asked questions about the Azure Cosmos DB integrated cache.-+ Last updated 09/20/2021-++ # Azure Cosmos DB integrated cache frequently asked questions
cosmos-db Integrated Cache https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/integrated-cache.md
Title: Azure Cosmos DB integrated cache description: The Azure Cosmos DB integrated cache is an in-memory cache that helps you ensure manageable costs and low latency as your request volume grows.-+ Last updated 09/28/2021-++ # Azure Cosmos DB integrated cache - Overview (Preview)
cosmos-db Migrate Cosmosdb Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/migrate-cosmosdb-data.md
Title: Migrate hundreds of terabytes of data into Azure Cosmos DB description: This doc describes how you can migrate 100s of terabytes of data into Cosmos DB--+++
cosmos-db Tutorial Mongotools Cosmos Db https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/mongodb/tutorial-mongotools-cosmos-db.md
Title: Migrate MongoDB offline to Azure Cosmos DB API for MongoDB, using MongoDB native tools description: Learn how MongoDB native tools can be used to migrate small datasets from MongoDB instances to Azure Cosmos DB--+++ Last updated 08/26/2021- # Tutorial: Migrate MongoDB to Azure Cosmos DB's API for MongoDB offline using MongoDB native tools
cosmos-db Partners Migration Cosmosdb https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/partners-migration-cosmosdb.md
Title: Migration and application development partners for Azure Cosmos DB description: Lists Microsoft partners with migration solutions that support Azure Cosmos DB.--+++ Last updated 08/26/2021
cosmos-db Plan Manage Costs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/plan-manage-costs.md
Title: Plan and manage costs for Azure Cosmos DB description: Learn how to plan for and manage costs for Azure Cosmos DB by using cost analysis in Azure portal.--+++
cosmos-db Certificate Based Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/certificate-based-authentication.md
Title: Certificate-based authentication with Azure Cosmos DB and Active Directory description: Learn how to configure an Azure AD identity for certificate-based authentication to access keys from Azure Cosmos DB.-+ Last updated 06/11/2019--++ - # Certificate-based authentication for an Azure AD identity to access keys from an Azure Cosmos DB account
cosmos-db Change Feed Design Patterns https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/change-feed-design-patterns.md
Title: Change feed design patterns in Azure Cosmos DB description: Overview of common change feed design patterns--+++ Last updated 03/24/2022- # Change feed design patterns in Azure Cosmos DB
cosmos-db Change Feed Functions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/change-feed-functions.md
Title: How to use Azure Cosmos DB change feed with Azure Functions description: Use Azure Functions to connect to Azure Cosmos DB change feed. Later you can create reactive Azure functions that are triggered on every new event.--+++ Last updated 10/14/2021- # Serverless event-based architectures with Azure Cosmos DB and Azure Functions
cosmos-db Change Feed Processor https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/change-feed-processor.md
Title: Change feed processor in Azure Cosmos DB description: Learn how to use the Azure Cosmos DB change feed processor to read the change feed, the components of the change feed processor--+++ ms.devlang: csharp Last updated 04/05/2022-
cosmos-db Change Feed Pull Model https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/change-feed-pull-model.md
Title: Change feed pull model description: Learn how to use the Azure Cosmos DB change feed pull model to read the change feed and the differences between the pull model and Change Feed Processor--+++ ms.devlang: csharp Last updated 04/07/2022- # Change feed pull model in Azure Cosmos DB
cosmos-db Changefeed Ecommerce Solution https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/changefeed-ecommerce-solution.md
Title: Use Azure Cosmos DB change feed to visualize real-time data analytics description: This article describes how change feed can be used by a retail company to understand user patterns, perform real-time data analysis and visualization--+++ ms.devlang: java - Last updated 03/24/2022
cosmos-db Create Sql Api Dotnet V4 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/create-sql-api-dotnet-v4.md
Title: Manage Azure Cosmos DB SQL API resources using .NET V4 SDK description: Use this quickstart to build a console app by using the .NET V4 SDK to manage Azure Cosmos DB SQL API account resources.--+++ ms.devlang: csharp
cosmos-db Create Sql Api Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/create-sql-api-dotnet.md
Title: Quickstart - Build a .NET console app to manage Azure Cosmos DB SQL API resources description: Learn how to build a .NET console app to manage Azure Cosmos DB SQL API account resources in this quickstart.--+++ ms.devlang: csharp
cosmos-db Create Sql Api Java Changefeed https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/create-sql-api-java-changefeed.md
Title: Create an end-to-end Azure Cosmos DB Java SDK v4 application sample by using Change Feed description: This guide walks you through a simple Java SQL API application which inserts documents into an Azure Cosmos DB container, while maintaining a materialized view of the container using Change Feed.-+ ms.devlang: java Last updated 06/11/2020-++
cosmos-db Create Sql Api Java https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/create-sql-api-java.md
Title: Quickstart - Use Java to create a document database using Azure Cosmos DB description: This quickstart presents a Java code sample you can use to connect to and query the Azure Cosmos DB SQL API-+ ms.devlang: java Last updated 08/26/2021-++
cosmos-db Create Sql Api Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/create-sql-api-nodejs.md
Title: Quickstart- Use Node.js to query from Azure Cosmos DB SQL API account description: How to use Node.js to create an app that connects to Azure Cosmos DB SQL API account and queries data.-+ ms.devlang: javascript Last updated 08/26/2021-++
cosmos-db Create Sql Api Spring Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/create-sql-api-spring-data.md
Title: Quickstart - Use Spring Data Azure Cosmos DB v3 to create a document database using Azure Cosmos DB description: This quickstart presents a Spring Data Azure Cosmos DB v3 code sample you can use to connect to and query the Azure Cosmos DB SQL API-+ ms.devlang: java Last updated 08/26/2021-++
cosmos-db How To Manage Conflicts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/how-to-manage-conflicts.md
Title: Manage conflicts between regions in Azure Cosmos DB description: Learn how to manage conflicts in Azure Cosmos DB by creating the last-writer-wins or a custom conflict resolution policy-+ Last updated 06/11/2020-++ ms.devlang: csharp, java, javascript
cosmos-db How To Manage Indexing Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/how-to-manage-indexing-policy.md
Title: Manage indexing policies in Azure Cosmos DB description: Learn how to manage indexing policies, include or exclude a property from indexing, how to define indexing using different Azure Cosmos DB SDKs-+ Last updated 05/25/2021-++
cosmos-db How To Use Stored Procedures Triggers Udfs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/how-to-use-stored-procedures-triggers-udfs.md
Title: Register and use stored procedures, triggers, and user-defined functions in Azure Cosmos DB SDKs description: Learn how to register and call stored procedures, triggers, and user-defined functions using the Azure Cosmos DB SDKs-+ Last updated 11/03/2021-++ ms.devlang: csharp, java, javascript, python
cosmos-db How To Write Javascript Query Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/how-to-write-javascript-query-api.md
Title: Write stored procedures and triggers using the JavaScript query API in Azure Cosmos DB description: Learn how to write stored procedures and triggers using the JavaScript Query API in Azure Cosmos DB -+ Last updated 05/07/2020-++ ms.devlang: javascript
cosmos-db How To Write Stored Procedures Triggers Udfs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/how-to-write-stored-procedures-triggers-udfs.md
Title: Write stored procedures, triggers, and UDFs in Azure Cosmos DB description: Learn how to define stored procedures, triggers, and user-defined functions in Azure Cosmos DB-+ Last updated 10/05/2021-++ ms.devlang: javascript
cosmos-db Index Metrics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/index-metrics.md
Title: Azure Cosmos DB indexing metrics description: Learn how to obtain and interpret the indexing metrics in Azure Cosmos DB-+ Last updated 10/25/2021-++ # Indexing metrics in Azure Cosmos DB [!INCLUDE[appliesto-sql-api](../includes/appliesto-sql-api.md)]
cosmos-db Javascript Query Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/javascript-query-api.md
Title: Work with JavaScript integrated query API in Azure Cosmos DB Stored Procedures and Triggers description: This article introduces the concepts for JavaScript language-integrated query API to create stored procedures and triggers in Azure Cosmos DB.-+ Last updated 05/07/2020--++ ms.devlang: javascript
cosmos-db Migrate Java V4 Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/migrate-java-v4-sdk.md
Title: Migrate your application to use the Azure Cosmos DB Java SDK v4 (com.azure.cosmos) description: Learn how to upgrade your existing Java application from using the older Azure Cosmos DB Java SDKs to the newer Java SDK 4.0 (com.azure.cosmos package)for Core (SQL) API.-+ ms.devlang: java -++ Last updated 08/26/2021- # Migrate your application to use the Azure Cosmos DB Java SDK v4
cosmos-db Performance Testing https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/performance-testing.md
Title: Performance and scale testing with Azure Cosmos DB description: Learn how to do scale and performance testing with Azure Cosmos DB. You can then evaluate the functionality of Azure Cosmos DB for high-performance application scenarios.--+++
cosmos-db Performance Tips Async Java https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/performance-tips-async-java.md
Title: Performance tips for Azure Cosmos DB Async Java SDK v2 description: Learn client configuration options to improve Azure Cosmos database performance for Async Java SDK v2-+ ms.devlang: java Last updated 05/11/2020-++
cosmos-db Performance Tips Java Sdk V4 Sql https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/performance-tips-java-sdk-v4-sql.md
Title: Performance tips for Azure Cosmos DB Java SDK v4 description: Learn client configuration options to improve Azure Cosmos database performance for Java SDK v4-+ ms.devlang: java Last updated 04/22/2022-++
cosmos-db Performance Tips Java https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/performance-tips-java.md
Title: Performance tips for Azure Cosmos DB Sync Java SDK v2 description: Learn client configuration options to improve Azure Cosmos database performance for Sync Java SDK v2-+ ms.devlang: java Last updated 05/11/2020-++
cosmos-db Query Cheat Sheet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/query-cheat-sheet.md
Title: Azure Cosmos DB PDF query cheat sheets description: Printable PDF cheat sheets that helps you use Azure Cosmos DB's SQL, MongoDB, Graph, and Table APIs to query your data--+++
cosmos-db Read Change Feed https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/read-change-feed.md
Title: Reading Azure Cosmos DB change feed description: This article describes different options available to read and access change feed in Azure Cosmos DB. --+++ Last updated 06/30/2021- # Reading Azure Cosmos DB change feed
cosmos-db Sql Api Java Application https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-api-java-application.md
Title: 'Tutorial: Build a Java web app using Azure Cosmos DB and the SQL API' description: 'Tutorial: This Java web application tutorial shows you how to use the Azure Cosmos DB and the SQL API to store and access data from a Java application hosted on Azure Websites.'-+ ms.devlang: java Last updated 03/29/2022-++ - # Tutorial: Build a Java web application using Azure Cosmos DB and the SQL API
cosmos-db Sql Api Java Sdk Samples https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-api-java-sdk-samples.md
Title: 'Azure Cosmos DB SQL API: Java SDK v4 examples' description: Find Java examples on GitHub for common tasks using the Azure Cosmos DB SQL API, including CRUD operations.-+ Last updated 08/26/2021 ms.devlang: java -++ # Azure Cosmos DB SQL API: Java SDK v4 examples
cosmos-db Sql Api Query Metrics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-api-query-metrics.md
Title: SQL query metrics for Azure Cosmos DB SQL API description: Learn about how to instrument and debug the SQL query performance of Azure Cosmos DB requests.--+++
cosmos-db Sql Api Sdk Async Java https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-api-sdk-async-java.md
Title: 'Azure Cosmos DB: SQL Async Java API, SDK & resources' description: Learn all about the SQL Async Java API and SDK including release dates, retirement dates, and changes made between each version of the Azure Cosmos DB SQL Async Java SDK.-+ ms.devlang: java Last updated 11/11/2021-++
cosmos-db Sql Api Sdk Bulk Executor Dot Net https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-api-sdk-bulk-executor-dot-net.md
Title: 'Azure Cosmos DB: Bulk executor .NET API, SDK & resources' description: Learn all about the bulk executor .NET API and SDK including release dates, retirement dates, and changes made between each version of the Azure Cosmos DB bulk executor .NET SDK.-+ ms.devlang: csharp Last updated 04/06/2021-++ # .NET bulk executor library: Download information (Legacy)
cosmos-db Sql Api Sdk Bulk Executor Java https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-api-sdk-bulk-executor-java.md
Title: 'Azure Cosmos DB: Bulk executor Java API, SDK & resources' description: Learn all about the bulk executor Java API and SDK including release dates, retirement dates, and changes made between each version of the Azure Cosmos DB bulk executor Java SDK.-+ ms.devlang: java Last updated 04/06/2021-++
cosmos-db Sql Api Sdk Dotnet Changefeed https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-api-sdk-dotnet-changefeed.md
Title: Azure Cosmos DB .NET change feed Processor API, SDK release notes description: Learn all about the Change Feed Processor API and SDK including release dates, retirement dates, and changes made between each version of the .NET Change Feed Processor SDK.-+ ms.devlang: csharp Last updated 04/06/2021-++ # .NET Change Feed Processor SDK: Download and release notes (Legacy)
cosmos-db Sql Api Sdk Dotnet Core https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-api-sdk-dotnet-core.md
Title: 'Azure Cosmos DB: SQL .NET Core API, SDK & resources' description: Learn all about the SQL .NET Core API and SDK including release dates, retirement dates, and changes made between each version of the Azure Cosmos DB .NET Core SDK.-+ ms.devlang: csharp Last updated 04/18/2022-++
cosmos-db Sql Api Sdk Dotnet Standard https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-api-sdk-dotnet-standard.md
Title: 'Azure Cosmos DB: SQL .NET Standard API, SDK & resources' description: Learn all about the SQL API and .NET SDK including release dates, retirement dates, and changes made between each version of the Azure Cosmos DB .NET SDK.-+ ms.devlang: csharp Last updated 03/22/2022-++
cosmos-db Sql Api Sdk Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-api-sdk-dotnet.md
Title: 'Azure Cosmos DB: SQL .NET API, SDK & resources' description: Learn all about the SQL .NET API and SDK including release dates, retirement dates, and changes made between each version of the Azure Cosmos DB .NET SDK.-+ ms.devlang: csharp Last updated 04/18/2022-++
cosmos-db Sql Api Sdk Java Spark V3 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-api-sdk-java-spark-v3.md
Title: 'Azure Cosmos DB Apache Spark 3 OLTP Connector for SQL API (Preview) release notes and resources' description: Learn about the Azure Cosmos DB Apache Spark 3 OLTP Connector for SQL API, including release dates, retirement dates, and changes made between each version of the Azure Cosmos DB SQL Java SDK.-+ ms.devlang: java Last updated 11/12/2021-++
cosmos-db Sql Api Sdk Java Spark https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-api-sdk-java-spark.md
Title: 'Azure Cosmos DB Apache Spark 2 OLTP Connector for SQL API release notes and resources' description: Learn about the Azure Cosmos DB Apache Spark 2 OLTP Connector for SQL API, including release dates, retirement dates, and changes made between each version of the Azure Cosmos DB SQL Async Java SDK.-+ ms.devlang: java Last updated 04/06/2021-++
cosmos-db Sql Api Sdk Java Spring V2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-api-sdk-java-spring-v2.md
Title: 'Spring Data Azure Cosmos DB v2 for SQL API release notes and resources' description: Learn about the Spring Data Azure Cosmos DB v2 for SQL API, including release dates, retirement dates, and changes made between each version of the Azure Cosmos DB SQL Async Java SDK.-+ ms.devlang: java Last updated 04/06/2021-++
cosmos-db Sql Api Sdk Java Spring V3 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-api-sdk-java-spring-v3.md
Title: 'Spring Data Azure Cosmos DB v3 for SQL API release notes and resources' description: Learn about the Spring Data Azure Cosmos DB v3 for SQL API, including release dates, retirement dates, and changes made between each version of the Azure Cosmos DB SQL Async Java SDK.-+ ms.devlang: java Last updated 04/06/2021-++
cosmos-db Sql Api Sdk Java V4 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-api-sdk-java-v4.md
Title: 'Azure Cosmos DB Java SDK v4 for SQL API release notes and resources' description: Learn all about the Azure Cosmos DB Java SDK v4 for SQL API and SDK including release dates, retirement dates, and changes made between each version of the Azure Cosmos DB SQL Async Java SDK.-+ ms.devlang: java Last updated 04/06/2021-++
cosmos-db Sql Api Sdk Java https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-api-sdk-java.md
Title: 'Azure Cosmos DB: SQL Java API, SDK & resources' description: Learn all about the SQL Java API and SDK including release dates, retirement dates, and changes made between each version of the Azure Cosmos DB SQL Java SDK.-+ ms.devlang: java Last updated 04/06/2021-++
cosmos-db Sql Api Sdk Node https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-api-sdk-node.md
Title: 'Azure Cosmos DB: SQL Node.js API, SDK & resources' description: Learn all about the SQL Node.js API and SDK including release dates, retirement dates, and changes made between each version of the Azure Cosmos DB Node.js SDK.-+ ms.devlang: javascript Last updated 12/09/2021-++
cosmos-db Sql Api Spring Data Sdk Samples https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-api-spring-data-sdk-samples.md
Title: 'Azure Cosmos DB SQL API: Spring Data v3 examples' description: Find Spring Data v3 examples on GitHub for common tasks using the Azure Cosmos DB SQL API, including CRUD operations.-+ Last updated 08/26/2021 -++ # Azure Cosmos DB SQL API: Spring Data Azure Cosmos DB v3 examples
cosmos-db Sql Query Aggregate Avg https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-aggregate-avg.md
Title: AVG in Azure Cosmos DB query language description: Learn about the Average (AVG) SQL system function in Azure Cosmos DB.-+ Last updated 12/02/2020-++ # AVG (Azure Cosmos DB)
cosmos-db Sql Query Aggregate Count https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-aggregate-count.md
Title: COUNT in Azure Cosmos DB query language description: Learn about the Count (COUNT) SQL system function in Azure Cosmos DB.-+ Last updated 12/02/2020-++ # COUNT (Azure Cosmos DB)
cosmos-db Sql Query Aggregate Functions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-aggregate-functions.md
Title: Aggregate functions in Azure Cosmos DB description: Learn about SQL aggregate function syntax, types of aggregate functions supported by Azure Cosmos DB.-+ Last updated 12/02/2020-++ # Aggregate functions in Azure Cosmos DB
cosmos-db Sql Query Aggregate Max https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-aggregate-max.md
Title: MAX in Azure Cosmos DB query language description: Learn about the Max (MAX) SQL system function in Azure Cosmos DB.-+ Last updated 12/02/2020-++ # MAX (Azure Cosmos DB)
cosmos-db Sql Query Aggregate Min https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-aggregate-min.md
Title: MIN in Azure Cosmos DB query language description: Learn about the Min (MIN) SQL system function in Azure Cosmos DB.-+ Last updated 12/02/2020-++ # MIN (Azure Cosmos DB)
cosmos-db Sql Query Aggregate Sum https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-aggregate-sum.md
Title: SUM in Azure Cosmos DB query language description: Learn about the Sum (SUM) SQL system function in Azure Cosmos DB.-+ Last updated 12/02/2020-++ # SUM (Azure Cosmos DB)
cosmos-db Sql Query Constants https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-constants.md
Title: SQL constants in Azure Cosmos DB description: Learn about how the SQL query constants in Azure Cosmos DB are used to represent a specific data value-+ Last updated 05/31/2019-++
cosmos-db Sql Query Datetimeadd https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-datetimeadd.md
Title: DateTimeAdd in Azure Cosmos DB query language description: Learn about SQL system function DateTimeAdd in Azure Cosmos DB.-+ Last updated 07/09/2020-++ # DateTimeAdd (Azure Cosmos DB)
cosmos-db Sql Query Datetimediff https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-datetimediff.md
Title: DateTimeDiff in Azure Cosmos DB query language description: Learn about SQL system function DateTimeDiff in Azure Cosmos DB.-+ Last updated 07/09/2020-++ # DateTimeDiff (Azure Cosmos DB)
cosmos-db Sql Query Datetimefromparts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-datetimefromparts.md
Title: DateTimeFromParts in Azure Cosmos DB query language description: Learn about SQL system function DateTimeFromParts in Azure Cosmos DB.-+ Last updated 07/09/2020-++ # DateTimeFromParts (Azure Cosmos DB)
cosmos-db Sql Query Datetimepart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-datetimepart.md
Title: DateTimePart in Azure Cosmos DB query language description: Learn about SQL system function DateTimePart in Azure Cosmos DB.-+ Last updated 08/14/2020-++ # DateTimePart (Azure Cosmos DB)
cosmos-db Sql Query Datetimetoticks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-datetimetoticks.md
Title: DateTimeToTicks in Azure Cosmos DB query language description: Learn about SQL system function DateTimeToTicks in Azure Cosmos DB.-+ Last updated 08/18/2020-++ # DateTimeToTicks (Azure Cosmos DB)
cosmos-db Sql Query Datetimetotimestamp https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-datetimetotimestamp.md
Title: DateTimeToTimestamp in Azure Cosmos DB query language description: Learn about SQL system function DateTimeToTimestamp in Azure Cosmos DB.-+ Last updated 08/18/2020-++ # DateTimeToTimestamp (Azure Cosmos DB)
cosmos-db Sql Query Equality Comparison Operators https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-equality-comparison-operators.md
Title: Equality and comparison operators in Azure Cosmos DB description: Learn about SQL equality and comparison operators supported by Azure Cosmos DB.-+ Last updated 01/07/2022-++ # Equality and comparison operators in Azure Cosmos DB
cosmos-db Sql Query From https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-from.md
Title: FROM clause in Azure Cosmos DB description: Learn about the SQL syntax, and example for FROM clause for Azure Cosmos DB. This article also shows examples to scope results, and get sub items by using the FROM clause.-+ Last updated 05/08/2020-++ # FROM clause in Azure Cosmos DB
cosmos-db Sql Query Geospatial Index https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-geospatial-index.md
Title: Index geospatial data with Azure Cosmos DB description: Index spatial data with Azure Cosmos DB-+ Last updated 11/03/2020-++ # Index geospatial data with Azure Cosmos DB
cosmos-db Sql Query Geospatial Intro https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-geospatial-intro.md
Title: Geospatial and GeoJSON location data in Azure Cosmos DB description: Understand how to create spatial objects with Azure Cosmos DB and the SQL API.-+ Last updated 02/17/2022-++ # Geospatial and GeoJSON location data in Azure Cosmos DB
cosmos-db Sql Query Geospatial Query https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-geospatial-query.md
Title: Querying geospatial data with Azure Cosmos DB description: Querying spatial data with Azure Cosmos DB-+ Last updated 02/20/2020-++ # Querying geospatial data with Azure Cosmos DB
cosmos-db Sql Query Getcurrentdatetime https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-getcurrentdatetime.md
Title: GetCurrentDateTime in Azure Cosmos DB query language description: Learn about SQL system function GetCurrentDateTime in Azure Cosmos DB.-+ Last updated 02/03/2021-++ # GetCurrentDateTime (Azure Cosmos DB)
cosmos-db Sql Query Getcurrentticks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-getcurrentticks.md
Title: GetCurrentTicks in Azure Cosmos DB query language description: Learn about SQL system function GetCurrentTicks in Azure Cosmos DB.-+ Last updated 02/03/2021-++ # GetCurrentTicks (Azure Cosmos DB)
cosmos-db Sql Query Getcurrenttimestamp https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-getcurrenttimestamp.md
Title: GetCurrentTimestamp in Azure Cosmos DB query language description: Learn about SQL system function GetCurrentTimestamp in Azure Cosmos DB.-+ Last updated 02/03/2021-++ # GetCurrentTimestamp (Azure Cosmos DB)
cosmos-db Sql Query Getting Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-getting-started.md
Title: Getting started with SQL queries in Azure Cosmos DB description: Learn how to use SQL queries to query data from Azure Cosmos DB. You can upload sample data to a container in Azure Cosmos DB and query it. -+ Last updated 08/26/2021-++ # Getting started with SQL queries
cosmos-db Sql Query Group By https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-group-by.md
Title: GROUP BY clause in Azure Cosmos DB description: Learn about the GROUP BY clause for Azure Cosmos DB.-+ Last updated 05/12/2022-++ # GROUP BY clause in Azure Cosmos DB
cosmos-db Sql Query Join https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-join.md
Title: SQL JOIN queries for Azure Cosmos DB description: Learn how to JOIN multiple tables in Azure Cosmos DB to query the data-+ Last updated 08/27/2021-++ # Joins in Azure Cosmos DB
cosmos-db Sql Query Keywords https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-keywords.md
Title: SQL keywords for Azure Cosmos DB description: Learn about SQL keywords for Azure Cosmos DB.-+ Last updated 10/05/2021-++ # Keywords in Azure Cosmos DB
cosmos-db Sql Query Linq To Sql https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-linq-to-sql.md
Title: LINQ to SQL translation in Azure Cosmos DB description: Learn the LINQ operators supported and how the LINQ queries are mapped to SQL queries in Azure Cosmos DB.-+ Last updated 08/06/2021-++ # LINQ to SQL translation
cosmos-db Sql Query Logical Operators https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-logical-operators.md
Title: Logical operators in Azure Cosmos DB description: Learn about SQL logical operators supported by Azure Cosmos DB.-+ Last updated 01/07/2022-++ # Logical operators in Azure Cosmos DB
cosmos-db Sql Query Object Array https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-object-array.md
Title: Working with arrays and objects in Azure Cosmos DB description: Learn the SQL syntax to create arrays and objects in Azure Cosmos DB. This article also provides some examples to perform operations on array objects -+ Last updated 02/02/2021-++ # Working with arrays and objects in Azure Cosmos DB
cosmos-db Sql Query Offset Limit https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-offset-limit.md
Title: OFFSET LIMIT clause in Azure Cosmos DB description: Learn how to use the OFFSET LIMIT clause to skip and take some certain values when querying in Azure Cosmos DB-+ Last updated 07/29/2020-++ # OFFSET LIMIT clause in Azure Cosmos DB
cosmos-db Sql Query Order By https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-order-by.md
Title: ORDER BY clause in Azure Cosmos DB description: Learn about SQL ORDER BY clause for Azure Cosmos DB. Use SQL as an Azure Cosmos DB JSON query language.-+ Last updated 04/27/2022-++ # ORDER BY clause in Azure Cosmos DB
cosmos-db Sql Query Pagination https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-pagination.md
Title: Pagination in Azure Cosmos DB description: Learn about paging concepts and continuation tokens--+++
cosmos-db Sql Query Parameterized Queries https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-parameterized-queries.md
Title: Parameterized queries in Azure Cosmos DB description: Learn how SQL parameterized queries provide robust handling and escaping of user input, and prevent accidental exposure of data through SQL injection.-+ Last updated 07/29/2020-++ # Parameterized queries in Azure Cosmos DB [!INCLUDE[appliesto-sql-api](../includes/appliesto-sql-api.md)]
cosmos-db Sql Query Regexmatch https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-regexmatch.md
Title: RegexMatch in Azure Cosmos DB query language description: Learn about the RegexMatch SQL system function in Azure Cosmos DB-+ Last updated 08/12/2021-++ # REGEXMATCH (Azure Cosmos DB)
cosmos-db Sql Query Select https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-select.md
Title: SELECT clause in Azure Cosmos DB description: Learn about SQL SELECT clause for Azure Cosmos DB. Use SQL as an Azure Cosmos DB JSON query language.-+ Last updated 05/08/2020-++ # SELECT clause in Azure Cosmos DB
cosmos-db Sql Query String Functions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-string-functions.md
Title: String functions in Azure Cosmos DB query language description: Learn about string SQL system functions in Azure Cosmos DB.-+ Last updated 05/26/2021-++ # String functions (Azure Cosmos DB)
cosmos-db Sql Query Stringequals https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-stringequals.md
Title: StringEquals in Azure Cosmos DB query language description: Learn about how the StringEquals SQL system function in Azure Cosmos DB returns a Boolean indicating whether the first string expression matches the second-+ Last updated 05/20/2020-++ # STRINGEQUALS (Azure Cosmos DB)
cosmos-db Sql Query Subquery https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-subquery.md
Title: SQL subqueries for Azure Cosmos DB description: Learn about SQL subqueries and their common use cases and different types of subqueries in Azure Cosmos DB-+ Last updated 07/30/2021-++ # SQL subquery examples for Azure Cosmos DB
cosmos-db Sql Query Ternary Coalesce Operators https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-ternary-coalesce-operators.md
Title: Ternary and coalesce operators in Azure Cosmos DB description: Learn about SQL ternary and coalesce operators supported by Azure Cosmos DB.-+ Last updated 01/07/2022-++ # Ternary and coalesce operators in Azure Cosmos DB
cosmos-db Sql Query Tickstodatetime https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-tickstodatetime.md
Title: TicksToDateTime in Azure Cosmos DB query language description: Learn about SQL system function TicksToDateTime in Azure Cosmos DB.-+ Last updated 08/18/2020-++ # TicksToDateTime (Azure Cosmos DB)
cosmos-db Sql Query Timestamptodatetime https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-timestamptodatetime.md
Title: TimestampToDateTime in Azure Cosmos DB query language description: Learn about SQL system function TimestampToDateTime in Azure Cosmos DB.-+ Last updated 08/18/2020-++ # TimestampToDateTime (Azure Cosmos DB)
cosmos-db Sql Query Type Checking Functions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-type-checking-functions.md
Title: Type checking functions in Azure Cosmos DB query language description: Learn about type checking SQL system functions in Azure Cosmos DB.-+ Last updated 05/26/2021-++ # Type checking functions (Azure Cosmos DB)
cosmos-db Sql Query Udfs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-udfs.md
Title: User-defined functions (UDFs) in Azure Cosmos DB description: Learn about User-defined functions in Azure Cosmos DB.-+ Last updated 04/09/2020-++
cosmos-db Sql Query Where https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-where.md
Title: WHERE clause in Azure Cosmos DB description: Learn about SQL WHERE clause for Azure Cosmos DB-+ Last updated 03/06/2020-++ # WHERE clause in Azure Cosmos DB
cosmos-db Sql Query Working With Json https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-working-with-json.md
Title: Working with JSON in Azure Cosmos DB description: Learn about to query and access nested JSON properties and use special characters in Azure Cosmos DB-+ Last updated 09/19/2020-++ # Working with JSON in Azure Cosmos DB
cosmos-db Stored Procedures Triggers Udfs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/stored-procedures-triggers-udfs.md
Title: Work with stored procedures, triggers, and UDFs in Azure Cosmos DB description: This article introduces the concepts such as stored procedures, triggers, and user-defined functions in Azure Cosmos DB.-+ Last updated 08/26/2021---++ # Stored procedures, triggers, and user-defined functions
cosmos-db Troubleshoot Dot Net Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/troubleshoot-dot-net-sdk.md
Title: Diagnose and troubleshoot issues when using Azure Cosmos DB .NET SDK description: Use features like client-side logging and other third-party tools to identify, diagnose, and troubleshoot Azure Cosmos DB issues when using .NET SDK.-+ Last updated 03/05/2021-++ - # Diagnose and troubleshoot issues when using Azure Cosmos DB .NET SDK
cosmos-db Troubleshoot Java Async Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/troubleshoot-java-async-sdk.md
Title: Diagnose and troubleshoot Azure Cosmos DB Async Java SDK v2 description: Use features like client-side logging and other third-party tools to identify, diagnose, and troubleshoot Azure Cosmos DB issues in Async Java SDK v2.-+ Last updated 05/11/2020-++ ms.devlang: java -
cosmos-db Troubleshoot Query Performance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/troubleshoot-query-performance.md
Title: Troubleshoot query issues when using Azure Cosmos DB description: Learn how to identify, diagnose, and troubleshoot Azure Cosmos DB SQL query issues.-+ Last updated 04/04/2022-++ - # Troubleshoot query issues when using Azure Cosmos DB [!INCLUDE[appliesto-sql-api](../includes/appliesto-sql-api.md)]
cosmos-db Tutorial Springboot Azure Kubernetes Service https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/tutorial-springboot-azure-kubernetes-service.md
Title: Tutorial - Spring Boot application with Azure Cosmos DB SQL API and Azure Kubernetes Service description: This tutorial demonstrates how to deploy a Spring Boot application to Azure Kubernetes Service and use it to perform operations on data in an Azure Cosmos DB SQL API account.-+ ms.devlang: java Last updated 10/01/2021-++
cosmos-db Working With Dates https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/working-with-dates.md
Title: Working with dates in Azure Cosmos DB
description: Learn how to store, index, and query DateTime objects in Azure Cosmos DB --+++ Last updated 04/03/2020 ms.devlang: csharp
cosmos-db Create Table Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/table/create-table-dotnet.md
Title: 'Quickstart: Table API with .NET - Azure Cosmos DB' description: This quickstart shows how to access the Azure Cosmos DB Table API from a .NET application using the Azure.Data.Tables SDK-+ ms.devlang: csharp Last updated 09/26/2021-++
data-factory How To Sqldb To Cosmosdb https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/how-to-sqldb-to-cosmosdb.md
SQL schemas are typically modeled using third normal form, resulting in normaliz
Using Azure Data Factory, we'll build a pipeline that uses a single Mapping Data Flow to read from two normalized Azure SQL Database tables whose primary and foreign keys define the entity relationship. ADF will join those tables into a single stream using the data flow Spark engine, collect the joined rows into arrays, and produce individual cleansed documents for insertion into a new Azure Cosmos DB container.
-This guide will build a new container on the fly called "orders" that will use the ```SalesOrderHeader``` and ```SalesOrderDetail``` tables from the standard SQL Server [Adventure Works sample database](/sql/samples/adventureworks-install-configure?tabs=ssms&view=sql-server-ver15). Those tables represent sales transactions joined by ```SalesOrderID```. Each unique detail records has its own primary key of ```SalesOrderDetailID```. The relationship between header and detail is ```1:M```. We'll join on ```SalesOrderID``` in ADF and then roll each related detail record into an array called "detail".
+This guide will build a new container on the fly called "orders" that will use the ```SalesOrderHeader``` and ```SalesOrderDetail``` tables from the standard SQL Server [Adventure Works sample database](/sql/samples/adventureworks-install-configure?tabs=ssms). Those tables represent sales transactions joined by ```SalesOrderID```. Each unique detail records has its own primary key of ```SalesOrderDetailID```. The relationship between header and detail is ```1:M```. We'll join on ```SalesOrderID``` in ADF and then roll each related detail record into an array called "detail".
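As a rough, hypothetical sketch of the resulting document shape (illustrative only, not the ADF mapping data flow itself, and using made-up sample rows), the following Python snippet joins header and detail rows on `SalesOrderID` and rolls the related detail rows into a nested `detail` array:

```python
# Hypothetical rows standing in for SalesOrderHeader and SalesOrderDetail.
headers = [
    {"SalesOrderID": 1, "OrderDate": "2011-05-31", "CustomerID": 29825},
    {"SalesOrderID": 2, "OrderDate": "2011-06-01", "CustomerID": 29672},
]
details = [
    {"SalesOrderID": 1, "SalesOrderDetailID": 10, "ProductID": 776, "OrderQty": 1},
    {"SalesOrderID": 1, "SalesOrderDetailID": 11, "ProductID": 777, "OrderQty": 3},
    {"SalesOrderID": 2, "SalesOrderDetailID": 12, "ProductID": 778, "OrderQty": 1},
]

# Join on SalesOrderID and roll each order's detail rows into a nested array,
# producing one document per order -- the denormalized shape written to the "orders" container.
orders = []
for header in headers:
    doc = dict(header)
    doc["detail"] = [d for d in details if d["SalesOrderID"] == header["SalesOrderID"]]
    orders.append(doc)

for order in orders:
    print(order)
```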
The representative SQL query for this guide is:
If everything looks good, you are now ready to create a new pipeline, add this d
## Next steps

* Build the rest of your data flow logic by using mapping data flows [transformations](concepts-data-flow-overview.md).
-* [Download the completed pipeline template](https://github.com/kromerm/adfdataflowdocs/blob/master/sampledata/SQL%20Orders%20to%20CosmosDB.zip) for this tutorial and import the template into your factory.
+* [Download the completed pipeline template](https://github.com/kromerm/adfdataflowdocs/blob/master/sampledata/SQL%20Orders%20to%20CosmosDB.zip) for this tutorial and import the template into your factory.
data-lake-store Data Lake Store Integrate With Other Services https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-lake-store/data-lake-store-integrate-with-other-services.md
description: Understand how you can integrate Azure Data Lake Storage Gen1 with
Previously updated : 05/29/2018 Last updated : 06/03/2022
You can register data from Data Lake Storage Gen1 into the Azure Data Catalog to
## Use Data Lake Storage Gen1 with SQL Server Integration Services (SSIS)

You can use the Data Lake Storage Gen1 connection manager in SSIS to connect an SSIS package with Data Lake Storage Gen1. For more information, see [Use Data Lake Storage Gen1 with SSIS](/sql/integration-services/connection-manager/azure-data-lake-store-connection-manager).
-## Use Data Lake Storage Gen1 with Azure Synapse Analytics
-You can use PolyBase to load data from Data Lake Storage Gen1 into Azure Synapse Analytics. For more information see [Use Data Lake Storage Gen1 with Azure Synapse Analytics](../synapse-analytics/sql-data-warehouse/sql-data-warehouse-load-from-azure-data-lake-store.md).
-
## Use Data Lake Storage Gen1 with Azure Event Hubs
You can use Azure Data Lake Storage Gen1 to archive and capture data received by Azure Event Hubs. For more information see [Use Data Lake Storage Gen1 with Azure Event Hubs](data-lake-store-archive-eventhub-capture.md).
machine-learning How To Auto Train Nlp Models https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-auto-train-nlp-models.md
ml_client.jobs.stream(returned_job.name)
See the following sample YAML files for each NLP task.
-* [Multi-class text classification](https://github.com/Azure/azureml-examples/blob/april-sdk-preview/cli/jobs/automl-standalone-jobs/cli-automl-text-classification-newsgroup/cli-automl-text-classification-newsgroup.yml)
-* [Multi-label text classification](https://github.com/Azure/azureml-examples/blob/april-sdk-preview/cli/jobs/automl-standalone-jobs/cli-automl-text-classification-multilabel-paper-cat/cli-automl-text-classification-multilabel-paper-cat.yml)
-* [Named entity recognition](https://github.com/Azure/azureml-examples/blob/april-sdk-preview/cli/jobs/automl-standalone-jobs/cli-automl-text-ner-conll/cli-automl-text-ner-conll2003.yml)
+* [Multi-class text classification](https://github.com/Azure/azureml-examples/blob/main/cli/jobs/automl-standalone-jobs/cli-automl-text-classification-newsgroup/cli-automl-text-classification-newsgroup.yml)
+* [Multi-label text classification](https://github.com/Azure/azureml-examples/blob/main/cli/jobs/automl-standalone-jobs/cli-automl-text-classification-multilabel-paper-cat/cli-automl-text-classification-multilabel-paper-cat.yml)
+* [Named entity recognition](https://github.com/Azure/azureml-examples/blob/main/cli/jobs/automl-standalone-jobs/cli-automl-text-ner-conll/cli-automl-text-ner-conll2003.yml)
# [Python SDK v2 (preview)](#tab/SDK-v2)
See the following sample YAML files for each NLP task.
See the sample notebooks for detailed code examples for each NLP task.
-* [Multi-class text classification](https://github.com/Azure/azureml-examples/blob/april-sdk-preview/sdk/jobs/automl-standalone-jobs/automl-nlp-text-classification-multiclass-task-sentiment-analysis/automl-nlp-text-classification-multiclass-task-sentiment.ipynb)
+* [Multi-class text classification](https://github.com/Azure/azureml-examples/blob/main/sdk/jobs/automl-standalone-jobs/automl-nlp-text-classification-multiclass-task-sentiment-analysis/automl-nlp-text-classification-multiclass-task-sentiment.ipynb)
* [Multi-label text classification](
-https://github.com/Azure/azureml-examples/blob/april-sdk-preview/sdk/jobs/automl-standalone-jobs/automl-nlp-text-classification-multilabel-task-paper-categorization/automl-nlp-text-classification-multilabel-task-paper-cat.ipynb)
-* [Named entity recognition](https://github.com/Azure/azureml-examples/blob/april-sdk-preview/sdk/jobs/automl-standalone-jobs/automl-nlp-text-named-entity-recognition-task/automl-nlp-text-ner-task.ipynb)
+https://github.com/Azure/azureml-examples/blob/main/sdk/jobs/automl-standalone-jobs/automl-nlp-text-classification-multilabel-task-paper-categorization/automl-nlp-text-classification-multilabel-task-paper-cat.ipynb)
+* [Named entity recognition](https://github.com/Azure/azureml-examples/blob/main/sdk/jobs/automl-standalone-jobs/automl-nlp-text-named-entity-recognition-task/automl-nlp-text-ner-task.ipynb)
machine-learning How To Deploy Managed Online Endpoints https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-deploy-managed-online-endpoints.md
For more information about the YAML schema, see the [online endpoint YAML refere
> [!NOTE]
> To use Kubernetes instead of managed endpoints as a compute target:
> 1. Create and attach your Kubernetes cluster as a compute target to your Azure Machine Learning workspace by using [Azure Machine Learning studio](how-to-attach-kubernetes-anywhere.md?&tabs=studio#attach-a-kubernetes-cluster-to-an-azureml-workspace).
-> 1. Use the [endpoint YAML](https://github.com/Azure/azureml-examples/blob/main/cli/endpoints/online/amlarc/endpoint.yml) to target Kubernetes instead of the managed endpoint YAML. You'll need to edit the YAML to change the value of `target` to the name of your registered compute target. You can use this [deployment.yaml](https://github.com/Azure/azureml-examples/blob/main/cli/endpoints/online/amlarc/blue-deployment.yml) that has additional properties applicable to Kubernetes deployment.
+> 1. Use the [endpoint YAML](https://github.com/Azure/azureml-examples/blob/main/cli/endpoints/online/managed/sample/endpoint.yml) to target Kubernetes instead of the managed endpoint YAML. You'll need to edit the YAML to change the value of `target` to the name of your registered compute target. You can use this [deployment.yaml](https://github.com/Azure/azureml-examples/blob/main/cli/endpoints/online/managed/sample/blue-deployment.yml) that has additional properties applicable to Kubernetes deployment.
>
> All the commands that are used in this article (except the optional SLA monitoring and Azure Log Analytics integration) can be used either with managed endpoints or with Kubernetes endpoints.
machine-learning How To Responsible Ai Dashboard Sdk Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-responsible-ai-dashboard-sdk-cli.md
The supplied datasets should be file datasets (uri_file type) in Parquet format.
- Learn more about the [concepts and techniques behind the Responsible AI dashboard](concept-responsible-ai-dashboard.md).
- Learn more about how to [collect data responsibly](concept-sourcing-human-data.md)
- View [sample YAML and Python notebooks](https://aka.ms/RAIsamples) to generate a Responsible AI dashboard with YAML or Python.
+- Learn more about how the Responsible AI Dashboard and Scorecard can be used to debug data and models and inform better decision making in this [tech community blog post](https://www.microsoft.com/ai/ai-lab-responsible-ai-dashboard)
+- Learn about how the Responsible AI Dashboard and Scorecard were used by the NHS in a [real life customer story](https://aka.ms/NHSCustomerStory)
+- Explore the features of the Responsible AI Dashboard through this [interactive AI Lab web demo](https://www.microsoft.com/ai/ai-lab-responsible-ai-dashboard)
machine-learning How To Responsible Ai Dashboard Ui https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-responsible-ai-dashboard-ui.md
After you've finished your experiment configuration, select **Create** to star
- Summarize and share your Responsible AI insights with the [Responsible AI scorecard as a PDF export](how-to-responsible-ai-scorecard.md). - Learn more about the [concepts and techniques behind the Responsible AI dashboard](concept-responsible-ai-dashboard.md). - Learn more about how to [collect data responsibly](concept-sourcing-human-data.md)
+- Learn more about how the Responsible AI Dashboard and Scorecard can be used to debug data and models and inform better decision making in this [tech community blog post](https://www.microsoft.com/ai/ai-lab-responsible-ai-dashboard)
+- Learn about how the Responsible AI Dashboard and Scorecard were used by the NHS in a [real life customer story](https://aka.ms/NHSCustomerStory)
+- Explore the features of the Responsible AI Dashboard through this [interactive AI Lab web demo](https://www.microsoft.com/ai/ai-lab-responsible-ai-dashboard)
machine-learning How To Responsible Ai Dashboard https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-responsible-ai-dashboard.md
Selecting the Treatment policy tab switches to a view to help determine real-wor
- Summarize and share your Responsible AI insights with the [Responsible AI scorecard as a PDF export](how-to-responsible-ai-scorecard.md). - Learn more about the [concepts and techniques behind the Responsible AI dashboard](concept-responsible-ai-dashboard.md). - View [sample YAML and Python notebooks](https://aka.ms/RAIsamples) to generate a Responsible AI dashboard with YAML or Python.
+- Explore the features of the Responsible AI Dashboard through this [interactive AI Lab web demo](https://www.microsoft.com/ai/ai-lab-responsible-ai-dashboard)
+- Learn more about how the Responsible AI Dashboard and Scorecard can be used to debug data and models and inform better decision making in this [tech community blog post](https://www.microsoft.com/ai/ai-lab-responsible-ai-dashboard)
+- Learn about how the Responsible AI Dashboard and Scorecard were used by the NHS in a [real life customer story](https://aka.ms/NHSCustomerStory)
machine-learning How To Responsible Ai Scorecard https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-responsible-ai-scorecard.md
Finally, you can observe your dataset's causal insights summarized, figuring o
- See the how-to guide for generating a Responsible AI dashboard via [CLIv2 and SDKv2](how-to-responsible-ai-dashboard-sdk-cli.md) or [studio UI ](how-to-responsible-ai-dashboard-ui.md). - Learn more about the [concepts and techniques behind the Responsible AI dashboard](concept-responsible-ai-dashboard.md). - View [sample YAML and Python notebooks](https://aka.ms/RAIsamples) to generate a Responsible AI dashboard with YAML or Python.
+- Learn more about how the Responsible AI Dashboard and Scorecard can be used to debug data and models and inform better decision making in this [tech community blog post](https://www.microsoft.com/ai/ai-lab-responsible-ai-dashboard)
+- See how the Responsible AI Dashboard and Scorecard were used by the NHS in a [real life customer story](https://aka.ms/NHSCustomerStory)
+- Explore the features of the Responsible AI Dashboard through this [interactive AI Lab web demo](https://www.microsoft.com/ai/ai-lab-responsible-ai-dashboard)
machine-learning How To Secure Online Endpoint https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-secure-online-endpoint.md
The following diagram shows how communications flow through private endpoints to
* You must have an Azure Resource Group, in which you (or the service principal you use) need to have `Contributor` access. You'll have such a resource group if you configured your ML extension per the above article.
-* You must have an Azure Machine Learning workspace, and the workspace must use a private endpoint. If you don't have one, the steps in this article create an example workspace, VNet, and VM. For more information, see [Configure a private endpoint for Azure Machine Learning workspace](how-to-configure-private-link.md).
+* You must have an Azure Machine Learning workspace, and the workspace must use a private endpoint. If you don't have one, the steps in this article create an example workspace, VNet, and VM. For more information, see [Configure a private endpoint for Azure Machine Learning workspace](/azure/machine-learning/how-to-configure-private-link).
* The Azure Container Registry for your workspace must be configured for __Premium__ tier. For more information, see [Azure Container Registry service tiers](../container-registry/container-registry-skus.md).
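If an existing registry doesn't yet meet the Premium-tier prerequisite above, the upgrade is typically a single command; this is a hedged sketch with a placeholder registry name, not a step from the article.

```azurecli
# Upgrade the workspace's container registry to the Premium tier (registry name is illustrative).
az acr update --name myregistry --sku Premium
```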
The following diagram shows how communications flow through private endpoints to
## Limitations * The `v1_legacy_mode` flag must be disabled (false) on your Azure Machine Learning workspace. If this flag is enabled, you won't be able to create a managed online endpoint. For more information, see [Network isolation with v2 API](how-to-configure-network-isolation-with-v2.md).
-* If your Azure Machine Learning workspace has a private endpoint that was created before May 24, 2022, you must recreate the workspace's private endpoint before configuring your online endpoints to use a private endpoint. For more information on creating a private endpoint for your workspace, see [How to configure a private endpoint for Azure Machine Learning workspace](how-to-configure-private-link.md).
+* If your Azure Machine Learning workspace has a private endpoint that was created before May 24, 2022, you must recreate the workspace's private endpoint before configuring your online endpoints to use a private endpoint. For more information on creating a private endpoint for your workspace, see [How to configure a private endpoint for Azure Machine Learning workspace](/azure/machine-learning/how-to-configure-private-link).
* Secure outbound communication creates three private endpoints per deployment. One to Azure Blob storage, one to Azure Container Registry, and one to your workspace.
To secure scoring requests to the online endpoint to your virtual network, set t
az ml online-endpoint create -f endpoint.yml --set public_network_access=disabled ```
-When `public_network_access` is `disabled`, inbound scoring requests are received using the [private endpoint of the Azure Machine Learning workspace](how-to-configure-private-link.md) and the endpoint can't be reached from public networks.
+When `public_network_access` is `disabled`, inbound scoring requests are received using the [private endpoint of the Azure Machine Learning workspace](/azure/machine-learning/how-to-configure-private-link) and the endpoint can't be reached from public networks.
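To complement the inbound setting described above, here's a hedged sketch of applying and verifying the flag on an existing endpoint; the endpoint name is a placeholder.

```azurecli
# Turn off public inbound scoring for an existing endpoint (name is illustrative).
az ml online-endpoint update --name my-endpoint --set public_network_access=disabled

# Confirm the current value of the setting.
az ml online-endpoint show --name my-endpoint --query public_network_access
```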
## Outbound (resource access)
az group delete --resource-group <resource-group-name>
- [How to autoscale managed online endpoints](how-to-autoscale-endpoints.md) - [View costs for an Azure Machine Learning managed online endpoint](how-to-view-online-endpoints-costs.md) - [Access Azure resources with an online endpoint and managed identity](how-to-access-resources-from-endpoints-managed-identities.md)-- [Troubleshoot online endpoints deployment](how-to-troubleshoot-online-endpoints.md)
+- [Troubleshoot online endpoints deployment](how-to-troubleshoot-online-endpoints.md)
machine-learning How To Access Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/v1/how-to-access-data.md
For situations where the SDK doesn't provide access to datastores, you might be
## Move data to supported Azure storage solutions
-Azure Machine Learning supports accessing data from Azure Blob storage, Azure Files, Azure Data Lake Storage Gen1, Azure Data Lake Storage Gen2, Azure SQL Database, and Azure Database for PostgreSQL. If you're using unsupported storage, we recommend that you move your data to supported Azure storage solutions by using [Azure Data Factory and these steps](/azure/data-factory/quickstart-create-data-factory-copy-data-tool.). Moving data to supported storage can help you save data egress costs during machine learning experiments.
+Azure Machine Learning supports accessing data from Azure Blob storage, Azure Files, Azure Data Lake Storage Gen1, Azure Data Lake Storage Gen2, Azure SQL Database, and Azure Database for PostgreSQL. If you're using unsupported storage, we recommend that you move your data to supported Azure storage solutions by using [Azure Data Factory and these steps](/azure/data-factory/quickstart-create-data-factory-copy-data-tool). Moving data to supported storage can help you save data egress costs during machine learning experiments.
Azure Data Factory provides efficient and resilient data transfer with more than 80 prebuilt connectors at no extra cost. These connectors include Azure data services, on-premises data sources, Amazon S3 and Redshift, and Google BigQuery.
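For a one-time copy into a supported store, rather than the repeatable Azure Data Factory pipelines recommended above, a utility such as AzCopy can also do the job; the following sketch uses placeholder paths and a placeholder SAS token.

```
# Copy a local folder into Blob storage; the account, container, and SAS token are placeholders.
azcopy copy "./training-data" "https://mystorageaccount.blob.core.windows.net/datasets?<SAS-token>" --recursive
```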
Azure Data Factory provides efficient and resilient data transfer with more than
* [Create an Azure machine learning dataset](how-to-create-register-datasets.md) * [Train a model](../how-to-set-up-training-targets.md)
-* [Deploy a model](../how-to-deploy-and-where.md)
+* [Deploy a model](../how-to-deploy-and-where.md)
machine-learning How To Use Managed Identities https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/v1/how-to-use-managed-identities.md
> * [v1](how-to-use-managed-identities.md) > * [v2 (current version)](../how-to-use-managed-identities.md)
-[Managed identities](/active-directory/managed-identities-azure-resources/overview) allow you to configure your workspace with the *minimum required permissions to access resources*.
+[Managed identities](/azure/active-directory/managed-identities-azure-resources/overview) allow you to configure your workspace with the *minimum required permissions to access resources*.
When configuring an Azure Machine Learning workspace in a trustworthy manner, it is important to ensure that different services associated with the workspace have the correct level of access. For example, during a machine learning workflow the workspace needs access to Azure Container Registry (ACR) for Docker images, and storage accounts for training data.
In this article, you'll learn how to use managed identities to:
- An Azure Machine Learning workspace. For more information, see [Create an Azure Machine Learning workspace](../how-to-manage-workspace.md). - The [Azure CLI extension for Machine Learning service](reference-azure-machine-learning-cli.md) - The [Azure Machine Learning Python SDK](/python/api/overview/azure/ml/intro).-- To assign roles, the login for your Azure subscription must have the [Managed Identity Operator](/role-based-access-control/built-in-roles#managed-identity-operator) role, or other role that grants the required actions (such as __Owner__).-- You must be familiar with creating and working with [Managed Identities](/active-directory/managed-identities-azure-resources/overview).
+- To assign roles, the login for your Azure subscription must have the [Managed Identity Operator](/azure/role-based-access-control/built-in-roles#managed-identity-operator) role, or other role that grants the required actions (such as __Owner__).
+- You must be familiar with creating and working with [Managed Identities](/azure/active-directory/managed-identities-azure-resources/overview).
## Configure managed identities
You can bring your own ACR with admin user disabled when you create the workspac
If ACR admin user is disallowed by subscription policy, you should first create ACR without admin user, and then associate it with the workspace. Also, if you have existing ACR with admin user disabled, you can attach it to the workspace.
-[Create ACR from Azure CLI](/container-registry/container-registry-get-started-azure-cli) without setting ```--admin-enabled``` argument, or from Azure portal without enabling admin user. Then, when creating Azure Machine Learning workspace, specify the Azure resource ID of the ACR. The following example demonstrates creating a new Azure ML workspace that uses an existing ACR:
+[Create ACR from Azure CLI](/azure/container-registry/container-registry-get-started-azure-cli) without setting ```--admin-enabled``` argument, or from Azure portal without enabling admin user. Then, when creating Azure Machine Learning workspace, specify the Azure resource ID of the ACR. The following example demonstrates creating a new Azure ML workspace that uses an existing ACR:
> [!TIP] > To get the value for the `--container-registry` parameter, use the [az acr show](/cli/azure/acr#az-acr-show) command to show information for your ACR. The `id` field contains the resource ID for your ACR.
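Putting the registry and workspace steps above together, the following is a hedged sketch; names are placeholders, and the exact workspace-creation parameters can differ between the v1 ML CLI extension this article covers and the newer v2 `ml` extension.

```azurecli
# Create a registry without enabling the admin user (note: no --admin-enabled flag).
az acr create --name myregistry --resource-group myresourcegroup --sku Premium

# Capture the registry's resource ID, as the tip above describes.
acr_id=$(az acr show --name myregistry --query id --output tsv)

# Create the workspace and attach the existing registry.
az ml workspace create -w myworkspace -g myresourcegroup --container-registry $acr_id
```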
Once you've configured ACR without admin user as described earlier, you can acce
## Create workspace with user-assigned managed identity
-When creating a workspace, you can bring your own [user-assigned managed identity](/active-directory/managed-identities-azure-resources/how-to-manage-ua-identity-cli) that will be used to access the associated resources: ACR, KeyVault, Storage, and App Insights.
+When creating a workspace, you can bring your own [user-assigned managed identity](/azure/active-directory/managed-identities-azure-resources/how-manage-user-assigned-managed-identities) that will be used to access the associated resources: ACR, KeyVault, Storage, and App Insights.
> [!IMPORTANT] > When creating workspace with user-assigned managed identity, you must create the associated resources yourself, and grant the managed identity roles on those resources. Use the [role assignment ARM template](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.machinelearningservices/machine-learning-dependencies-role-assignment) to make the assignments.
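A hedged end-to-end sketch of that flow follows; the identity and workspace names are placeholders, and the exact flag for passing the identity may vary by CLI extension version.

```azurecli
# Create the user-assigned managed identity and capture its resource ID.
az identity create --name myworkspace-identity --resource-group myresourcegroup
identity_id=$(az identity show --name myworkspace-identity --resource-group myresourcegroup --query id --output tsv)

# Grant the identity roles on ACR, Key Vault, Storage, and App Insights first
# (for example, with the role assignment ARM template linked above), then create the workspace.
az ml workspace create -w myworkspace -g myresourcegroup --primary-user-assigned-identity $identity_id
```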
postgresql Quickstart Create Server Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/postgresql/flexible-server/quickstart-create-server-cli.md
Create an [Azure resource group](../../azure-resource-manager/management/overvie
az group create --name myresourcegroup --location westus ```
-Create a flexible server with the `az postgres flexible-server create` command. A server can contain multiple databases. The following command creates a server using service defaults and values from your Azure CLI's [local context](/cli/azure/local-context):
+Create a flexible server with the `az postgres flexible-server create` command. A server can contain multiple databases. The following command creates a server using service defaults and values from your Azure CLI's [local context](/cli/local-context):
```azurecli az postgres flexible-server create
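# A hedged, fuller form of the command above with illustrative values; every value
# below (server name, admin credentials, SKU, tier, storage size) is a placeholder,
# not a value taken from this quickstart.
az postgres flexible-server create \
  --resource-group myresourcegroup \
  --name mydemoserver \
  --location westus \
  --admin-user myadmin \
  --admin-password <server_admin_password> \
  --sku-name Standard_B1ms \
  --tier Burstable \
  --storage-size 32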
private-link Private Endpoint Dns https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/private-link/private-endpoint-dns.md
For Azure services, use the recommended zone names as described in the following
| Storage account (Microsoft.Storage/storageAccounts) / File (file, file_secondary) | privatelink.file.core.usgovcloudapi.net | file.core.usgovcloudapi.net | | Storage account (Microsoft.Storage/storageAccounts) / Web (web, web_secondary) | privatelink.web.core.usgovcloudapi.net | web.core.usgovcloudapi.net | | Azure Cosmos DB (Microsoft.AzureCosmosDB/databaseAccounts) / Sql | privatelink.documents.azure.us | documents.azure.us |
-| Azure Batch (Microsoft.Batch/batchAccounts) / batchAccount | privatelink.{region}.batch.usgovcloudapi.net | {region}.batch.usgovcloudapi.net |
+| Azure Batch (Microsoft.Batch/batchAccounts) / batchAccount | privatelink.batch.usgovcloudapi.net | {region}.batch.usgovcloudapi.net |
+| Azure Batch (Microsoft.Batch/batchAccounts) / nodeManagement | privatelink.batch.usgovcloudapi.net | {region}.service.batch.usgovcloudapi.net |
| Azure Database for PostgreSQL - Single server (Microsoft.DBforPostgreSQL/servers) / postgresqlServer | privatelink.postgres.database.usgovcloudapi.net | postgres.database.usgovcloudapi.net | | Azure Database for MySQL (Microsoft.DBforMySQL/servers) / mysqlServer | privatelink.mysql.database.usgovcloudapi.net | mysql.database.usgovcloudapi.net| | Azure Database for MariaDB (Microsoft.DBforMariaDB/servers) / mariadbServer | privatelink.mariadb.database.usgovcloudapi.net| mariadb.database.usgovcloudapi.net |
Based on your preferences, the following scenarios are available with DNS resolu
- [Azure Private Endpoint DNS configuration](#azure-private-endpoint-dns-configuration) - [Azure services DNS zone configuration](#azure-services-dns-zone-configuration)
+ - [Government](#government)
- [China](#china) - [DNS configuration scenarios](#dns-configuration-scenarios) - [Virtual network workloads without custom DNS server](#virtual-network-workloads-without-custom-dns-server)
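As a hedged illustration of wiring up one of the recommended zones from the table above, the following creates a zone and links it to a virtual network; the zone shown, the resource group, and the network names are placeholders.

```azurecli
# Create the private DNS zone that matches the recommended name for the service.
az network private-dns zone create --resource-group myresourcegroup --name "privatelink.batch.usgovcloudapi.net"

# Link the zone to the virtual network that hosts the private endpoint.
az network private-dns link vnet create --resource-group myresourcegroup \
  --zone-name "privatelink.batch.usgovcloudapi.net" \
  --name my-dns-link --virtual-network myvnet --registration-enabled false
```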
purview Register Scan Power Bi Tenant Cross Tenant https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-power-bi-tenant-cross-tenant.md
Title: Connect to and manage a Power BI tenant cross tenant
-description: This guide describes how to connect to a cross-tenant Power BI tenant in Microsoft Purview, and use Microsoft Purview's features to scan and manage your Power BI tenant source.
+ Title: Connect to and manage a Power BI tenant (cross-tenant)
+description: This guide describes how to connect to a Power BI tenant in a cross-tenant scenario. You use Microsoft Purview to scan and manage your Power BI tenant source.
Last updated 04/29/2022
-# Connect to and manage a Power BI tenant in Microsoft Purview (Cross Tenant)
+# Connect to and manage a Power BI tenant in Microsoft Purview (cross-tenant)
-This article outlines how to register a Power BI tenant in a cross-tenant scenario, and how to authenticate and interact with the tenant in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).
+This article outlines how to register a Power BI tenant in a cross-tenant scenario, and how to authenticate and interact with the tenant in Microsoft Purview. If you're unfamiliar with the service, see [What is Microsoft Purview?](overview.md).
## Supported capabilities
-|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Access Policy**|**Lineage**|
+|**Metadata extraction**| **Full scan** |**Incremental scan**|**Scoped scan**|**Classification**|**Access policy**|**Lineage**|
|||||||| | [Yes](#deployment-checklist)| [Yes](#deployment-checklist)| Yes | No | No | No| [Yes](how-to-lineage-powerbi.md)| ### Supported scenarios for Power BI scans
-|**Scenarios** |**Microsoft Purview public access allowed/denied** |**Power BI public access allowed /denied** | **Runtime option** | **Authentication option** | **Deployment checklist** |
+|**Scenario** |**Microsoft Purview public access** |**Power BI public access** | **Runtime option** | **Authentication option** | **Deployment checklist** |
|||||||
-|Public access with Azure IR |Allowed |Allowed |Azure runtime |Delegated Authentication | [Deployment checklist](#deployment-checklist) |
-|Public access with Self-hosted IR |Allowed |Allowed |Self-hosted runtime |Delegated Authentication | [Deployment checklist](#deployment-checklist) |
+|Public access with Azure integration runtime |Allowed |Allowed |Azure runtime |Delegated authentication | [Deployment checklist](#deployment-checklist) |
+|Public access with self-hosted integration runtime |Allowed |Allowed |Self-hosted runtime |Delegated authentication | [Deployment checklist](#deployment-checklist) |
### Known limitations -- For cross-tenant scenario, delegated authentication is only supported option for scanning.-- You can create only one scan for a Power BI data source that is registered in your Microsoft Purview account.-- If Power BI dataset schema isn't shown after scan, it's due to one of the current limitations with [Power BI Metadata scanner](/power-bi/admin/service-admin-metadata-scanning).-- Empty workspaces are skipped.
+- For the cross-tenant scenario, delegated authentication is the only supported option for scanning.
+- You can create only one scan for a Power BI data source that is registered in your Microsoft Purview account.
+- If the Power BI dataset schema isn't shown after the scan, it's due to one of the current limitations with the [Power BI metadata scanner](/power-bi/admin/service-admin-metadata-scanning).
+- Empty workspaces are skipped.
## Prerequisites
-Before you start, make sure you have the following prerequisites:
+Before you start, make sure you have the following:
- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F). - An active [Microsoft Purview account](create-catalog-portal.md).
-## Authentication options
--- Delegated Authentication- ## Deployment checklist
-Use any of the following deployment checklists during the setup or for troubleshooting purposes, based on your scenario:
-# [Public access with Azure IR](#tab/Scenario1)
+Use either of the following deployment checklists during the setup, or for troubleshooting purposes, based on your scenario.
+
+# [Public access with Azure integration runtime](#tab/Scenario1)
-### Scan cross-tenant Power BI using Azure IR and Delegated Authentication in public network
+### Scan cross-tenant Power BI by using delegated authentication in a public network
-1. Make sure Power BI and Microsoft Purview accounts are in cross-tenant.
+1. Make sure the Power BI and Microsoft Purview accounts are in the cross-tenant mode.
-1. Make sure Power BI tenant ID is entered correctly during the registration. By default, Power BI tenant ID that exists in the same Azure Active Directory as Microsoft Purview will be populated.
+1. Make sure the Power BI tenant ID is entered correctly during the registration. By default, the Power BI tenant ID that exists in the same Azure Active Directory (Azure AD) instance as Microsoft Purview will be populated.
-1. Make sure your [PowerBI Metadata model is up to date by enabling metadata scanning.](/power-bi/admin/service-admin-metadata-scanning-setup#enable-tenant-settings-for-metadata-scanning)
+1. Make sure your [Power BI metadata model is up to date by enabling metadata scanning](/power-bi/admin/service-admin-metadata-scanning-setup#enable-tenant-settings-for-metadata-scanning).
-1. From Azure portal, validate if Microsoft Purview account Network is set to public access.
+1. From the Azure portal, validate if the Microsoft Purview account network is set to **public access**.
-1. From Power BI tenant Admin Portal, make sure Power BI tenant is configured to allow public network.
+1. From the Power BI tenant admin portal, make sure the Power BI tenant is configured to allow a public network.
-1. Check your Azure Key Vault to make sure:
+1. Check your instance of Azure Key Vault to make sure:
1. There are no typos in the password.
- 2. Microsoft Purview Managed Identity has get/list access to secrets.
+ 2. Microsoft Purview managed identity has get and list access to secrets.
-1. Review your credential to validate:
- 1. Client ID matches _Application (Client) ID_ of the app registration.
- 2. Username includes the user principal name such as `johndoe@contoso.com`.
+1. Review your credential to validate that the:
+ 1. Client ID matches the _Application (Client) ID_ of the app registration.
+ 2. Username includes the user principal name, such as `johndoe@contoso.com`.
-1. In Power BI Azure AD tenant, validate Power BI admin user settings to make sure:
- 1. User is assigned to Power BI Administrator role.
+1. In the Power BI Azure AD tenant, validate the following Power BI admin user settings:
+ 1. The user is assigned to the Power BI administrator role.
2. At least one [Power BI license](/power-bi/admin/service-admin-licensing-organization#subscription-license-types) is assigned to the user.
- 3. If user is recently created, sign in with the user at least once to make sure password is reset successfully and user can successfully initiate the session.
- 4. There's no MFA or Conditional Access Policies are enforced on the user.
+ 3. If the user is recently created, sign in with the user at least once, to make sure that the password is reset successfully, and the user can successfully initiate the session.
+ 4. There are no multifactor authentication or conditional access policies enforced on the user.
-1. In Power BI Azure AD tenant, validate App registration settings to make sure:
- 1. App registration exists in your Azure Active Directory tenant where Power BI tenant is located.
- 2. Under **API permissions**, the following **delegated permissions** and **grant admin consent for the tenant** is set up with read for the following APIs:
+1. In the Power BI Azure AD tenant, validate the following app registration settings:
+ 1. The app registration exists in your Azure AD tenant where the Power BI tenant is located.
+ 2. Under **API permissions**, the following APIs are set up with **read** for **delegated permissions** and **grant admin consent for the tenant**:
1. Power BI Service Tenant.Read.All 2. Microsoft Graph openid 3. Microsoft Graph User.Read 3. Under **Authentication**:
- 1. **Supported account types**, **Accounts in any organizational directory (Any Azure AD directory - Multitenant)** is selected.
- 2. **Implicit grant and hybrid flows**, **ID tokens (used for implicit and hybrid flows)** is selected.
+ 1. **Supported account types** > **Accounts in any organizational directory (Any Azure AD directory - Multitenant)** is selected.
+ 2. **Implicit grant and hybrid flows** > **ID tokens (used for implicit and hybrid flows)** is selected.
3. **Allow public client flows** is enabled.
-# [Public access with Self-hosted IR](#tab/Scenario2)
-### Scan cross-tenant Power BI using self-hosted IR and Delegated Authentication in public network
+# [Public access with self-hosted integration runtime](#tab/Scenario2)
-1. Make sure Power BI and Microsoft Purview accounts are in cross-tenant.
+### Scan cross-tenant Power BI by using delegated authentication in a public network
-1. Make sure Power BI tenant ID is entered correctly during the registration.By default, Power BI tenant ID that exists in the same Azure Active Directory as Microsoft Purview will be populated.
+1. Make sure the Power BI and Microsoft Purview accounts are in the cross-tenant mode.
-1. Make sure your [PowerBI Metadata model is up to date by enabling metadata scanning.](/power-bi/admin/service-admin-metadata-scanning-setup#enable-tenant-settings-for-metadata-scanning)
+1. Make sure the Power BI tenant ID is entered correctly during the registration. By default, the Power BI tenant ID that exists in the same Azure Active Directory (Azure AD) instance as Microsoft Purview will be populated.
-1. From Azure portal, validate if Microsoft Purview account Network is set to public access.
+1. Make sure your [Power BI metadata model is up to date by enabling metadata scanning](/power-bi/admin/service-admin-metadata-scanning-setup#enable-tenant-settings-for-metadata-scanning).
-1. From Power BI tenant Admin Portal, make sure Power BI tenant is configured to allow public network.
+1. From the Azure portal, validate if the Microsoft Purview account network is set to **public access**.
-1. Check your Azure Key Vault to make sure:
+1. From the Power BI tenant admin portal, make sure the Power BI tenant is configured to allow a public network.
+
+1. Check your instance of Azure Key Vault to make sure:
1. There are no typos in the password.
- 2. Microsoft Purview Managed Identity has get/list access to secrets.
+ 2. Microsoft Purview managed identity has get and list access to secrets.
-1. Review your credential to validate:
- 1. Client ID matches _Application (Client) ID_ of the app registration.
- 2. Username includes the user principal name such as `johndoe@contoso.com`.
+1. Review your credential to validate that the:
+ 1. Client ID matches the _Application (Client) ID_ of the app registration.
+ 2. Username includes the user principal name, such as `johndoe@contoso.com`.
-1. In Power BI Azure AD tenant, validate Power BI admin user settings to make sure:
- 1. User is assigned to Power BI Administrator role.
+1. In the Power BI Azure AD tenant, validate the following Power BI admin user settings:
+ 1. The user is assigned to the Power BI administrator role.
2. At least one [Power BI license](/power-bi/admin/service-admin-licensing-organization#subscription-license-types) is assigned to the user.
- 3. If user is recently created, sign in with the user at least once to make sure password is reset successfully and user can successfully initiate the session.
- 4. There's no MFA or Conditional Access Policies are enforced on the user.
+ 3. If the user is recently created, sign in with the user at least once, to make sure that the password is reset successfully, and the user can successfully initiate the session.
+ 4. There are no multifactor authentication or conditional access policies enforced on the user.
-1. In Power BI Azure AD tenant, validate App registration settings to make sure:
- 5. App registration exists in your Azure Active Directory tenant where Power BI tenant is located.
- 6. Under **API permissions**, the following **delegated permissions** and **grant admin consent for the tenant** is set up with read for the following APIs:
+1. In the Power BI Azure AD tenant, validate the following app registration settings:
+ 1. The app registration exists in your Azure AD tenant where the Power BI tenant is located.
+ 2. Under **API permissions**, the following APIs are set up with **read** for **delegated permissions** and **grant admin consent for the tenant**:
1. Power BI Service Tenant.Read.All 2. Microsoft Graph openid 3. Microsoft Graph User.Read
- 7. Under **Authentication**:
- 1. **Supported account types**, **Accounts in any organizational directory (Any Azure AD directory - Multitenant)** is selected.
- 2. **Implicit grant and hybrid flows**, **ID tokens (used for implicit and hybrid flows)** is selected.
+ 3. Under **Authentication**:
+ 1. **Supported account types** > **Accounts in any organizational directory (Any Azure AD directory - Multitenant)** is selected.
+ 2. **Implicit grant and hybrid flows** > **ID tokens (used for implicit and hybrid flows)** is selected.
3. **Allow public client flows** is enabled.
-1. Validate Self-hosted runtime settings:
- 8. Latest version of [Self-hosted runtime](https://www.microsoft.com/download/details.aspx?id=39717) is installed on the VM.
- 9. Network connectivity from Self-hosted runtime to Power BI tenant is enabled.
- 10. Network connectivity from Self-hosted runtime to Microsoft services is enabled.
- 11. [JDK 8 or later](https://www.oracle.com/java/technologies/javase-jdk11-downloads.html) is installed.
+1. Validate the following self-hosted runtime settings:
+ 1. The latest version of the [self-hosted runtime](https://www.microsoft.com/download/details.aspx?id=39717) is installed on the VM.
+ 1. Network connectivity from the self-hosted runtime to the Power BI tenant is enabled.
+ 1. Network connectivity from the self-hosted runtime to Microsoft services is enabled.
+ 1. [JDK 8 or later](https://www.oracle.com/java/technologies/javase-jdk11-downloads.html) is installed.
-## Register Power BI tenant
+## Register the Power BI tenant
-1. Select the **Data Map** on the left navigation.
+1. From the options on the left, select **Data Map**.
-1. Then select **Register**.
+1. Select **Register**, and then select **Power BI** as your data source.
- Select **Power BI** as your data source.
+ :::image type="content" source="media/setup-power-bi-scan-catalog-portal/select-power-bi-data-source.png" alt-text="Screenshot that shows the list of data sources available to choose.":::
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/select-power-bi-data-source.png" alt-text="Image showing the list of data sources available to choose.":::
+1. Give your Power BI instance a friendly name. The name must be 3 to 63 characters long, and must contain only letters, numbers, underscores, and hyphens. Spaces aren't allowed.
-1. Give your Power BI instance a friendly name. The name must be between 3-63 characters long and must contain only letters, numbers, underscores, and hyphens. Spaces aren't allowed.
+1. Edit the **Tenant ID** field, to replace with the cross-tenant Power BI that you want to register and scan. By default, the Power BI tenant ID that exists in the same Azure AD instance as Microsoft Purview is populated.
-1. Edit the Tenant ID field to replace with cross Power BI tenant you want to register and scan. By default, Power BI tenant ID that exists in the same Azure Active Directory as Microsoft Purview will be populated.
-
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/register-cross-tenant.png" alt-text="Image showing the registration experience for cross tenant Power BI":::
+ :::image type="content" source="media/setup-power-bi-scan-catalog-portal/register-cross-tenant.png" alt-text="Screenshot that shows the registration experience for cross-tenant Power BI.":::
## Scan cross-tenant Power BI
+Delegated authentication is the only supported option for cross-tenant scanning. You can use either Azure runtime or a self-hosted integration runtime to run a scan.
+ > [!TIP] > To troubleshoot any issues with scanning:
-> 1. Confirm you have completed the [**deployment checklist for your scenario**](#deployment-checklist).
-> 1. Review our [**scan troubleshooting documentation**](register-scan-power-bi-tenant-troubleshoot.md).
-
-### Scan cross-tenant Power BI using Delegated authentication
-
-Delegated authentication is the only supported option for cross-tenant scan option, however, you can use either Azure runtime or a self-hosted integration runtime to run a scan.
+> 1. Confirm you have completed the [deployment checklist for your scenario](#deployment-checklist).
+> 1. Review the [scan troubleshooting documentation](register-scan-power-bi-tenant-troubleshoot.md).
-To create and run a new scan using Azure runtime, perform the following steps:
+To create and run a new scan by using the Azure runtime, perform the following steps:
-1. Create a user account in Azure Active Directory tenant where Power BI tenant is located and assign the user to Azure Active Directory role, **Power BI Administrator**. Take note of username and sign in to change the password.
+1. Create a user account in the Azure AD tenant where the Power BI tenant is located, and assign the user to the Azure AD role, **Power BI Administrator**. Take note of the username and sign in to change the password.
-2. Assign proper Power BI license to the user.
+1. Assign the proper Power BI license to the user.
-2. Navigate to your Azure key vault in the tenant where Microsoft Purview is created.
+1. Go to the instance of Azure Key Vault in the tenant where Microsoft Purview is created.
-3. Select **Settings** > **Secrets** and select **+ Generate/Import**.
+1. Select **Settings** > **Secrets**, and then select **+ Generate/Import**.
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-key-vault.png" alt-text="Screenshot how to navigate to Azure Key Vault.":::
+ :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-key-vault.png" alt-text="Screenshot of the instance of Azure Key Vault.":::
-5. Enter a name for the secret and for **Value**, type the newly created password for the Azure AD user. Select **Create** to complete.
+1. Enter a name for the secret. For **Value**, type the newly created password for the Azure AD user. Select **Create** to complete.
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-key-vault-secret.png" alt-text="Screenshot how to generate an Azure Key Vault secret.":::
+ :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-key-vault-secret.png" alt-text="Screenshot that shows how to generate a secret in Azure Key Vault.":::
-6. If your key vault isn't connected to Microsoft Purview yet, you'll need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account)
+1. If your key vault isn't connected to Microsoft Purview yet, you need to [create a new key vault connection](manage-credentials.md#create-azure-key-vaults-connections-in-your-microsoft-purview-account).
-7. Create an App Registration in your Azure Active Directory tenant where Power BI is located. Provide a web URL in the **Redirect URI**. Take note of Client ID(App ID).
+1. Create an app registration in your Azure AD tenant where Power BI is located. Provide a web URL in the **Redirect URI**. Take note of the client ID (app ID).
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-create-service-principle.png" alt-text="Screenshot how to create a Service Principle.":::
+ :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-create-service-principle.png" alt-text="Screenshot that shows how to create a service principle.":::
-8. From Azure Active Directory dashboard, select newly created application and then select **App permissions**. Assign the application the following delegated permissions and grant admin consent for the tenant:
+1. From the Azure AD dashboard, select the newly created application, and then select **App permissions**. Assign the application the following delegated permissions, and grant admin consent for the tenant:
- Power BI Service Tenant.Read.All - Microsoft Graph openid - Microsoft Graph User.Read
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-delegated-permissions.png" alt-text="Screenshot of delegated permissions for Power BI Service and Microsoft Graph.":::
+ :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-delegated-permissions.png" alt-text="Screenshot of delegated permissions for Power BI and Microsoft Graph.":::
-9. From Azure Active Directory dashboard, select newly created application and then select **Authentication**. Under **Supported account types** select **Accounts in any organizational directory (Any Azure AD directory - Multitenant)**.
+1. From the Azure AD dashboard, select the newly created application, and then select **Authentication**. Under **Supported account types**, select **Accounts in any organizational directory (Any Azure AD directory - Multitenant)**.
:::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-multitenant.png" alt-text="Screenshot of account type support multitenant.":::
-10. Under **Implicit grant and hybrid flows**, ensure to select **ID tokens (used for implicit and hybrid flows)**
+1. Under **Implicit grant and hybrid flows**, select **ID tokens (used for implicit and hybrid flows)**.
:::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-id-token-hybrid-flows.png" alt-text="Screenshot of ID token hybrid flows.":::
-11. Under **Advanced settings**, enable **Allow Public client flows**.
+1. Under **Advanced settings**, enable **Allow Public client flows**.
-12. In the Microsoft Purview Studio, navigate to the **Data map** in the left menu. Navigate to **Sources**.
+1. In the Microsoft Purview Studio, go to the **Data map** in the left menu. Go to **Sources**.
-13. Select the registered Power BI source from cross tenant.
+1. Select the registered Power BI source from cross-tenant.
-14. Select **+ New scan**.
+1. Select **+ New scan**.
-15. Give your scan a name. Then select the option to include or exclude the personal workspaces.
+1. Give your scan a name. Then select the option to include or exclude the personal workspaces.
> [!Note]
- > Switching the configuration of a scan to include or exclude a personal workspace will trigger a full scan of PowerBI source.
+ > If you switch the configuration of a scan to include or exclude a personal workspace, you trigger a full scan of the Power BI source.
-16. Select **Azure AutoResolveIntegrationRuntime** from the drop-down list.
+1. Select **Azure AutoResolveIntegrationRuntime** from the dropdown list.
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-scan-cross-tenant.png" alt-text="Image showing Power BI scan setup using Azure IR for cross tenant.":::
+ :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-scan-cross-tenant.png" alt-text="Screenshot that shows the Power BI scan setup, using Azure integration runtime for cross-tenant.":::
-17. For the **Credential**, select **Delegated authentication** and select **+ New** to create a new credential.
+1. For the **Credential**, select **Delegated authentication**, and then select **+ New** to create a new credential.
-18. Create a new credential and provide required parameters:
+1. Create a new credential and provide the following required parameters:
- - **Name**: Provide a unique name for credential.
+ - **Name**: Provide a unique name for the credential.
- - **Client ID**: Use Service Principal Client ID (App ID) you created earlier.
+ - **Client ID**: Use the service principal client ID (app ID) that you created earlier.
- - **User name**: Provide the username of Power BI Administrator you created earlier.
+ - **User name**: Provide the username of the Power BI administrator that you created earlier.
- - **Password**: Select the appropriate Key vault connection and the **Secret name** where the Power BI account password was saved earlier.
+ - **Password**: Select the appropriate Key Vault connection, and the **Secret name** where the Power BI account password was saved earlier.
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-scan-delegated-authentication.png" alt-text="Image showing Power BI scan setup using Delegated authentication.":::
+ :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-scan-delegated-authentication.png" alt-text="Screenshot that shows the Power BI scan setup, using delegated authentication.":::
-19. Select **Test Connection** before continuing to next steps.
+1. Select **Test connection** before continuing to the next steps.
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-scan-cross-tenant-test.png" alt-text="Screenshot of test connection status.":::
+ :::image type="content" source="media/setup-power-bi-scan-catalog-portal/power-bi-scan-cross-tenant-test.png" alt-text="Screenshot that shows the test connection status.":::
- If **Test Connection** failed, select **View Report** to see the detailed status and troubleshoot the problem:
+ If the test fails, select **View Report** to see the detailed status and troubleshoot the problem:
- 1. Access - Failed status means the user authentication failed: Validate if username and password are correct. review if the Credential contains correct Client (App) ID from the App Registration.
- 2. Assets (+ lineage) - Failed status means the Microsoft Purview - Power BI authorization has failed. Make sure the user is added to Power BI Administrator role and has proper Power BI license assigned to.
- 3. Detailed metadata (Enhanced) - Failed status means the Power BI admin portal is disabled for the following setting - **Enhance admin APIs responses with detailed metadata**
+ 1. *Access - Failed* status means that the user authentication failed. Validate if the username and password are correct. Review if the credential contains the correct client (app) ID from the app registration.
+ 2. *Assets (+ lineage) - Failed* status means that the authorization between Microsoft Purview and Power BI has failed. Make sure that the user is added to the Power BI administrator role, and has the proper Power BI license assigned.
+ 3. *Detailed metadata (Enhanced) - Failed* status means that the Power BI admin portal is disabled for the following setting: **Enhance admin APIs responses with detailed metadata**.
-20. Set up a scan trigger. Your options are **Recurring**, and **Once**.
+1. Set up a scan trigger. Your options are **Recurring** or **Once**.
:::image type="content" source="media/setup-power-bi-scan-catalog-portal/scan-trigger.png" alt-text="Screenshot of the Microsoft Purview scan scheduler.":::
-18. On **Review new scan**, select **Save and run** to launch your scan.
+1. On **Review new scan**, select **Save and run** to launch your scan.
- :::image type="content" source="media/setup-power-bi-scan-catalog-portal/save-run-power-bi-scan.png" alt-text="Screenshot of Save and run Power BI source.":::
+ :::image type="content" source="media/setup-power-bi-scan-catalog-portal/save-run-power-bi-scan.png" alt-text="Screenshot that shows how to save and run the Power BI source.":::
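The Azure AD user, Key Vault secret, and app registration that the procedure above creates in the portal can also be prepared from the command line. The sketch below is hedged: every name is a placeholder, and the redirect-URI flag in particular depends on the Azure CLI version (older versions use `--reply-urls` instead of `--web-redirect-uris`).

```azurecli
# Create the scanning user in the Azure AD tenant where the Power BI tenant lives.
az ad user create --display-name "Purview Power BI scanner" \
  --user-principal-name scanner@contoso.onmicrosoft.com \
  --password "<initial-password>"

# Store that password in the Key Vault that's connected to Microsoft Purview.
az keyvault secret set --vault-name my-purview-keyvault --name powerbi-scanner-password --value "<initial-password>"

# Create the app registration used for delegated authentication.
az ad app create --display-name "purview-powerbi-scan" --web-redirect-uris "https://purview.example.com/auth"
```

Assigning the Power BI administrator role and license, granting the delegated API permissions, and enabling public client flows still follow the portal steps above.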
## Next steps
-Now that you've registered your source, follow the below guides to learn more about Microsoft Purview and your data.
+Now that you've registered your source, see the following guides to learn more about Microsoft Purview and your data.
-- [Data Estate Insights in Microsoft Purview](concept-insights.md)
+- [Data estate insights in Microsoft Purview](concept-insights.md)
- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)-- [Search Data Catalog](how-to-search-catalog.md)
+- [Search data catalog](how-to-search-catalog.md)
search Search Indexer Troubleshooting https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/search-indexer-troubleshooting.md
When you receive any of these errors:
* Make sure your source is accessible by trying to connect to it directly and not through the search service * Check your source in the Azure portal for any current errors or outages * Check for any network outages in [Azure Status](https://status.azure.com/status)
-* Check you are using public DNS for name resolution and not an [Azure Private DNS](/dns/private-dns-overview)
+* Check you are using public DNS for name resolution and not an [Azure Private DNS](/azure/dns/private-dns-overview)
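One quick, hedged way to check the last point is to resolve the data source's host name from a VM in the same virtual network and see whether it answers with a public address or a `privatelink` alias; the storage account name below is a placeholder.

```
nslookup mystorageaccount.blob.core.windows.net
# A CNAME such as mystorageaccount.privatelink.blob.core.windows.net returning a private IP
# suggests resolution is going through a private DNS zone rather than public DNS.
```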
## Azure SQL Database serverless indexing (error code 40613)
sentinel Sap Solution Log Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sentinel/sap/sap-solution-log-reference.md
This article describes the functions, logs, and tables available as part of the
## Functions available from the SAP solution
-This section describes the [functions](/azure-monitor/logs/functions.md) that are available in your workspace after you've deployed the Continuous Threat Monitoring for SAP solution. Find these functions in the Microsoft Sentinel **Logs** page to use in your KQL queries, listed under **Workspace functions**.
+This section describes the [functions](/azure/azure-monitor/logs/functions) that are available in your workspace after you've deployed the Continuous Threat Monitoring for SAP solution. Find these functions in the Microsoft Sentinel **Logs** page to use in your KQL queries, listed under **Workspace functions**.
Users are *strongly encouraged* to use the functions as the subjects of their analysis whenever possible, instead of the underlying logs or tables. These functions are intended to serve as the principal user interface to the data. They form the basis for all the built-in analytics rules and workbooks available to you out of the box. This allows for changes to be made to the data infrastructure beneath the functions, without breaking user-created content.
storage Network File System Protocol Support Performance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/network-file-system-protocol-support-performance.md
Previously updated : 06/21/2021 Last updated : 06/03/2022
The read_ahead_kb kernel parameter represents the amount of additional data that
``` export AZMNT=/your/container/mountpoint
-echo 15728640 > /sys/class/bdi/0:$(stat -c "%d" $AZMNT)/read_ahead_kb
+echo 16300 > /sys/class/bdi/0:$(stat -c "%d" $AZMNT)/read_ahead_kb
``` ## Avoid frequent overwrites on data
storage Secure File Transfer Protocol Support How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/secure-file-transfer-protocol-support-how-to.md
Previously updated : 03/04/2022 Last updated : 06/03/2022
To learn more about SFTP support for Azure Blob Storage, see [SSH File Transfer
## Enable SFTP support
-This section shows you how to enable SFTP support for an existing storage account. To view an Azure Resource Manager template that enables SFTP support as part of creating the account, see [Create an Azure Storage Account and Blob Container accessible using SFTP protocol on Azure](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.storage/storage-sftp). To view the Local User REST APIs and .NET references, see [Local Users](https://docs.microsoft.com/rest/api/storagerp/local-users) and [LocalUser Class](https://docs.microsoft.com/dotnet/api/microsoft.azure.management.storage.models.localuser?view=azure-dotnet).
+This section shows you how to enable SFTP support for an existing storage account. To view an Azure Resource Manager template that enables SFTP support as part of creating the account, see [Create an Azure Storage Account and Blob Container accessible using SFTP protocol on Azure](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.storage/storage-sftp). To view the Local User REST APIs and .NET references, see [Local Users](/rest/api/storagerp/local-users) and [LocalUser Class](/dotnet/api/microsoft.azure.management.storage.models.localuser).
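Besides the portal steps that follow and the ARM template linked above, SFTP can usually be toggled from the Azure CLI as well; this is a hedged sketch with placeholder names, and the `--enable-sftp` flag assumes a recent CLI version.

```azurecli
# Enable SFTP on an existing storage account (account and resource group names are illustrative).
az storage account update --name mystorageaccount --resource-group myresourcegroup --enable-sftp true

# Verify the setting.
az storage account show --name mystorageaccount --resource-group myresourcegroup --query isSftpEnabled
```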
### [Portal](#tab/azure-portal)
To learn more about the SFTP permissions model, see [SFTP Permissions model](sec
> [!IMPORTANT] > While you can enable both forms of authentication, SFTP clients can connect by using only one of them. Multifactor authentication, whereby both a valid password and a valid public and private key pair are required for successful authentication is not supported.
- If you select **Secure with a password**, then your password will appear when you've completed all of the steps in the **Add local user** configuration pane.
+ If you select **SSH Password**, then your password will appear when you've completed all of the steps in the **Add local user** configuration pane.
- If you select **Secure with SSH public key**, then select **Add key source** to specify a key source.
+ If you select **SSH Key pair**, then select **Public key source** to specify a key source.
> [!div class="mx-imgBorder"] > ![Local user configuration pane](./media/secure-file-transfer-protocol-support-how-to/add-local-user-config-page.png)
To learn more about the SFTP permissions model, see [SFTP Permissions model](sec
If you want to use a password to authenticate the local user, you can generate one after the local user is created.
-3. If want to use an SSH key, create a public key object by using the **New-AzStorageLocalUserSshPublicKey** command. Set the `-Key` parameter to a string that contains the key type and public key. In the following example, the key type is `ssh-rsa` and the key is `ssh-rsa a2V5...`.
+3. If you want to use an SSH key, create a public key object by using the **New-AzStorageLocalUserSshPublicKey** command. Set the `-Key` parameter to a string that contains the key type and public key. In the following example, the key type is `ssh-rsa` and the key is `ssh-rsa a2V5...`.
```powershell $sshkey = "ssh-rsa a2V5..."
storage Storage Blob Download https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-blob-download.md
public static async Task DownloadToStream(BlobClient blobClient, string localFil
The following example downloads a blob to a string. This example assumes that the blob is a text file. ```csharp
-public static async Task DownloadToText(BlobClient blobClient, string localFilePath)
+public static async Task DownloadToText(BlobClient blobClient)
{ BlobDownloadResult downloadResult = await blobClient.DownloadContentAsync(); string downloadedData = downloadResult.Content.ToString();
storage File Sync Monitoring https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/file-sync/file-sync-monitoring.md
Alerts proactively notify you when important conditions are found in your monito
5. Fill in the **Alert details** like **Alert rule name**, **Description** and **Severity**. 6. Click **Create alert rule** to create the alert.
+ > [!Note]
+ > If you configure an alert using the Server Name dimension and the server is renamed, the alert will need to be updated to monitor the new server name.
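As one illustration of the alert-rule steps above, a sync-health alert might be created from the CLI as sketched below; the metric name, dimension, threshold, and resource IDs are illustrative assumptions rather than values from this article.

```azurecli
# Alert when a registered server reports no successful sync sessions in the evaluation window.
# The metric (ServerSyncSessionResult) and dimension (ServerName) names are assumptions.
az monitor metrics alert create --name "server-sync-failing" --resource-group myresourcegroup \
  --scopes "/subscriptions/<sub-id>/resourceGroups/myresourcegroup/providers/Microsoft.StorageSync/storageSyncServices/my-sync-service" \
  --condition "max ServerSyncSessionResult < 1 where ServerName includes myserver.contoso.com" \
  --window-size 1h --evaluation-frequency 30m \
  --description "Sync sessions are failing for myserver"
```

If the server is later renamed, update the `where ServerName includes ...` clause accordingly, per the note above.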
+ The following table lists some example scenarios to monitor and the proper metric to use for the alert: | Scenario | Metric to use for alert |
synapse-analytics Security White Paper Introduction https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/guidance/security-white-paper-introduction.md
Azure Synapse is a Platform-as-a-service (PaaS) analytics service that brings to
[Pipelines](../../data-factory/concepts-pipelines-activities.md) are a logical grouping of activities that perform data movement and data transformation at scale. [Data flow](../../data-factory/concepts-data-flow-overview.md) is a transformation activity in a pipeline that's developed by using a low-code user interface. It can execute data transformations at scale. Behind the scenes, data flows use Apache Spark clusters of Azure Synapse to execute automatically generated code. Pipelines and data flows are compute-only services, and they don't have any managed storage associated with them.
-Pipelines use the Integration Runtime (IR) as the scalable compute infrastructure for performing data movement and dispatch activities. Data movement activities run on the IR whereas the dispatch activities run on variety of other compute engines, including Azure SQL Database, Azure HDInsight, Azure Databricks, Apache Spark clusters of Azure Synapse, and others. Azure Synapse supports two types of IR: Azure Integration Runtime and Self-hosted Integration Runtime. The [Azure IR](../../data-factory/concepts-integration-runtime.md#azure-integration-runtime) provides a fully managed, scalable, and on-demand compute infrastructure. The [Self-hosted IR](../../data-factory/concepts-integration-runtime.md#self-hosted-integration-runtime) is installed and configured by the customer in their own network, either in on-premises machines or in Azure cloud virtual machines.
+Pipelines use the Integration Runtime (IR) as the scalable compute infrastructure for performing data movement and dispatch activities. Data movement activities run on the IR whereas the dispatch activities run on a variety of other compute engines, including Azure SQL Database, Azure HDInsight, Azure Databricks, Apache Spark clusters of Azure Synapse, and others. Azure Synapse supports two types of IR: Azure Integration Runtime and Self-hosted Integration Runtime. The [Azure IR](/azure/data-factory/concepts-integration-runtime#azure-integration-runtime) provides a fully managed, scalable, and on-demand compute infrastructure. The [Self-hosted IR](/azure/data-factory/concepts-integration-runtime#self-hosted-integration-runtime) is installed and configured by the customer in their own network, either in on-premises machines or in Azure cloud virtual machines.
Customers can choose to associate their Synapse workspace with a [managed workspace virtual network](../security/synapse-workspace-managed-vnet.md). When associated with a managed workspace virtual network, Azure IRs and Apache Spark clusters that are used by pipelines, data flows, and the Apache Spark pools are deployed inside the managed workspace virtual network. This setup ensures network isolation between the workspaces for pipelines and Apache Spark workloads.
Azure Synapse implements a multi-layered security architecture for end-to-end pr
## Next steps
-In the [next article](security-white-paper-data-protection.md) in this white paper series, learn about data protection.
+In the [next article](security-white-paper-data-protection.md) in this white paper series, learn about data protection.
synapse-analytics Connect Synapse Link Sql Database https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/synapse-link/connect-synapse-link-sql-database.md
This article provides a step-by-step guide for getting started with Azure Synaps
1. Select **Create**. > [!NOTE]
- > The linked service that you create here is not dedicated to Azure Synapse Link for SQL - it can be used by any workspace user that has the appropriate permissions. Please take time to understand the scope of users who may have access to this linked service and its credentials. For more information on permissions in Azure Synapse workspaces, see [Azure Synapse workspace access control overview - Azure Synapse Analytics](/synapse-analytics/security/synapse-workspace-access-control-overview).
+ > The linked service that you create here is not dedicated to Azure Synapse Link for SQL - it can be used by any workspace user that has the appropriate permissions. Please take time to understand the scope of users who may have access to this linked service and its credentials. For more information on permissions in Azure Synapse workspaces, see [Azure Synapse workspace access control overview - Azure Synapse Analytics](/azure/synapse-analytics/security/synapse-workspace-access-control-overview).
1. Select one or more source tables to replicate to your Synapse workspace and select **Continue**.
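To review the scope of users who could use such a linked service, a minimal sketch (the workspace name is a placeholder) that lists the existing Synapse RBAC role assignments with the Az.Synapse module:

```azurepowershell-interactive
# List Synapse RBAC role assignments to see who has access in the workspace
Get-AzSynapseRoleAssignment -WorkspaceName "<your Synapse workspace name>"
```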
synapse-analytics Connect Synapse Link Sql Server 2022 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/synapse-link/connect-synapse-link-sql-server-2022.md
This article provides a step-by-step guide for getting started with Azure Synaps
:::image type="content" source="../media/connect-synapse-link-sql-server-2022/view-linked-service-connection.png" alt-text="View the linked service connection."::: > [!NOTE]
- > The linked service that you create here is not dedicated to Azure Synapse Link for SQL - it can be used by any workspace user that has the appropriate permissions. Please take time to understand the scope of users who may have access to this linked service and its credentials. For more information on permissions in Azure Synapse workspaces, see [Azure Synapse workspace access control overview - Azure Synapse Analytics](/synapse-analytics/security/synapse-workspace-access-control-overview).
+ > The linked service that you create here is not dedicated to Azure Synapse Link for SQL - it can be used by any workspace user that has the appropriate permissions. Please take time to understand the scope of users who may have access to this linked service and its credentials. For more information on permissions in Azure Synapse workspaces, see [Azure Synapse workspace access control overview - Azure Synapse Analytics](/azure/synapse-analytics/security/synapse-workspace-access-control-overview).
## Create linked service to connect to your landing zone on Azure Data Lake Storage Gen2
This article provides a step-by-step guide for getting started with Azure Synaps
:::image type="content" source="../media/connect-synapse-link-sql-server-2022/storage-gen2-linked-service-created.png" alt-text="New linked service to Azure Data Lake Storage Gen2."::: > [!NOTE]
- > The linked service that you create here is not dedicated to Azure Synapse Link for SQL - it can be used by any workspace user that has the appropriate permissions. Please take time to understand the scope of users who may have access to this linked service and its credentials. For more information on permissions in Azure Synapse workspaces, see [Azure Synapse workspace access control overview - Azure Synapse Analytics](/synapse-analytics/security/synapse-workspace-access-control-overview).
+ > The linked service that you create here is not dedicated to Azure Synapse Link for SQL - it can be used by any workspace user that has the appropriate permissions. Please take time to understand the scope of users who may have access to this linked service and its credentials. For more information on permissions in Azure Synapse workspaces, see [Azure Synapse workspace access control overview - Azure Synapse Analytics](/azure/synapse-analytics/security/synapse-workspace-access-control-overview).
## Create the Azure Synapse Link connection
synapse-analytics Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/whats-new.md
The following updates are new to Azure Synapse Analytics this month.
* Assigning parameters dynamically based on variables, metadata, or specifying Pipeline specific parameters has been one of your top feature requests. Now, with the release of parameterization for the Spark job definition activity, you can do just that. For more details, read [Transform data using Apache Spark job definition](quickstart-transform-data-using-spark-job-definition.md#settings-tab).
-* We often receive customer requests to access the snapshot of the Notebook when there is a Pipeline Notebook run failure or there is a long-running Notebook job. With the release of the Synapse Notebook snapshot feature, you can now view the snapshot of the Notebook activity run with the original Notebook code, the cell output, and the input parameters. You can also access the snapshot of the referenced Notebook from the referencing Notebook cell output if you refer to other Notebooks through Spark utils. To learn more, read [Transform data by running a Synapse notebook](synapse-notebook-activity.md?tabs=classical#see-notebook-activity-run-history) and [Introduction to Microsoft Spark utilities](/spark/microsoft-spark-utilities.md?pivots=programming-language-scala#reference-a-notebook-1).
+* We often receive customer requests to access the snapshot of the Notebook when there is a Pipeline Notebook run failure or there is a long-running Notebook job. With the release of the Synapse Notebook snapshot feature, you can now view the snapshot of the Notebook activity run with the original Notebook code, the cell output, and the input parameters. You can also access the snapshot of the referenced Notebook from the referencing Notebook cell output if you refer to other Notebooks through Spark utils. To learn more, read [Transform data by running a Synapse notebook](synapse-notebook-activity.md?tabs=classical#see-notebook-activity-run-history) and [Introduction to Microsoft Spark utilities](/azure/synapse-analytics/spark/microsoft-spark-utilities?pivots=programming-language-scala#reference-a-notebook-1).
## Security
virtual-machines Run Scripts In Vm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/run-scripts-in-vm.md
The [Custom Script Extension](../extensions/custom-script-linux.md) is primarily
The [Run Command](run-command.md) feature enables virtual machine and application management and troubleshooting using scripts, and is available even when the machine is not reachable, for example if the guest firewall doesn't have the RDP or SSH port open. * Run scripts in Azure virtual machines.
-* Can be run using [Azure portal](run-command.md), [REST API](/rest/api/compute/virtual-machines-run-commands/run-command), [Azure CLI](/cli/azure/vm/run-command#az-vm-run-command-invoke), or [PowerShell](/powershell/module/az.compute/invoke-azvmruncommand)
+* Can be run using [Azure portal](run-command.md), [REST API](/azure/virtual-machines/windows/run-command), [Azure CLI](/cli/azure/vm/run-command#az-vm-run-command-invoke), or [PowerShell](/powershell/module/az.compute/invoke-azvmruncommand)
* Quickly run a script and view output and repeat as needed in the Azure portal. * Script can be typed directly or you can run one of the built-in scripts. * Run PowerShell script in Windows machines and Bash script in Linux machines.
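For example, a minimal sketch (resource group, VM name, and script path are placeholders) of invoking Run Command on a Linux VM from Azure PowerShell:

```azurepowershell-interactive
# Run a local Bash script on a Linux VM through the Run Command feature
Invoke-AzVMRunCommand -ResourceGroupName "myResourceGroup" -VMName "myLinuxVM" `
    -CommandId "RunShellScript" -ScriptPath "./sample.sh"
```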
virtual-machines Vm Applications How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/vm-applications-how-to.md
Before you get started, make sure you have the following:
This article assumes you already have an Azure Compute Gallery. If you don't already have a gallery, create one first. To learn more, see [Create a gallery for storing and sharing resources](create-gallery.md).
-You should have uploaded your application to a container in an Azure storage account. Your application can be stored in a block or page blob. If you choose to use a page blob, you need to byte align the files before you upload them. Here is a sample that will byte align your file:
+You should have uploaded your application to a container in an [Azure storage account](../storage/common/storage-account-create.md). Your application can be stored in a block or page blob. If you choose to use a page blob, you need to byte align the files before you upload them. Here's a sample that will byte align your file:
```azurepowershell-interactive
$inputFile = "<the file you want to pad>"
# Page blobs require the file size to be a multiple of 512 bytes, so pad with zero bytes if needed
$remainder = (Get-Item -Path $inputFile).Length % 512
if ($remainder -ne 0){
    $padding = New-Object byte[] (512 - $remainder)
    $stream = [System.IO.File]::Open((Resolve-Path $inputFile).Path, [System.IO.FileMode]::Append, [System.IO.FileAccess]::Write)
    $stream.Write($padding, 0, $padding.Length)
    $stream.Close()
}
```
-You need to make sure the files are publicly available, or you will need the SAS URI for the files in your storage account. You can use [Storage Explorer](../vs-azure-tools-storage-explorer-blobs.md) to quickly create a SAS URI if you don't already have one.
+You need to make sure the files are publicly available, or you'll need the SAS URI for the files in your storage account. You can use [Storage Explorer](../vs-azure-tools-storage-explorer-blobs.md) to quickly create a SAS URI if you don't already have one.
-If you are using PowerShell, you need to be using version 3.11.0 of the Az.Storage module.
+If you're using PowerShell, you need to be using version 3.11.0 of the Az.Storage module.
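If you don't want to make the blob public, here's a minimal sketch (account name, key, container, and blob name are placeholders) that creates a read-only SAS URI with the Az.Storage module:

```azurepowershell-interactive
# Generate a read-only SAS URI, valid for 24 hours, for the uploaded application package
$ctx = New-AzStorageContext -StorageAccountName "<storage account name>" -StorageAccountKey "<storage account key>"
New-AzStorageBlobSASToken -Context $ctx -Container "<container name>" -Blob "<filename>" `
    -Permission r -ExpiryTime (Get-Date).AddHours(24) -FullUri
```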
## Create the VM application
Choose an option below for creating your VM application definition and version:
- Link to a Eula - URI of a privacy statement - URI for release notes
-1. When you are done, select **Review + create**.
+1. When you're done, select **Review + create**.
1. When validation completes, select **Create** to have the definition deployed. 1. Once the deployment is complete, select **Go to resource**. 1. On the page for the application, select **Create a VM application version**. The **Create a VM Application Version** page will open. 1. Enter a version number like 1.0.0.
-1. Select the region where you have uploaded your application package.
-1. Under **Source application package**, select **Browse**. Select the storage account, then the container where your package is located. Select the package from the list and then click **Select** when you are done.
+1. Select the region where you've uploaded your application package.
+1. Under **Source application package**, select **Browse**. Select the storage account, then the container where your package is located. Select the package from the list and then click **Select** when you're done.
1. Type in the **Install script**. You can also provide the **Uninstall script** and **Update script**. See the [Overview](vm-applications.md#command-interpreter) for information on how to create the scripts. 1. If you have a default configuration file uploaded to a storage account, you can select it in **Default configuration**.
-1. Select **Exclude from latest** if you do not want this version to appear as the latest version when you create a VM.
-1. For **End of life date**, choose a date in the future to track when this version should be retired. It is not deleted or removed automatically, it is only for your own tracking.
-1. To replicate this version to other regions, select the **Replication** tab and add more regions and make changes to the number of replicas per region. The original region where your version was created must be in the list and cannot be removed.
-1. When you are done making changes, select **Review + create** at the bottom of the page.
+1. Select **Exclude from latest** if you don't want this version to appear as the latest version when you create a VM.
+1. For **End of life date**, choose a date in the future to track when this version should be retired. It isn't deleted or removed automatically, it's only for your own tracking.
+1. To replicate this version to other regions, select the **Replication** tab and add more regions and make changes to the number of replicas per region. The original region where your version was created must be in the list and can't be removed.
+1. When you're done making changes, select **Review + create** at the bottom of the page.
1. When validation shows as passed, select **Create** to deploy your VM application version.
Select the VM application from the list, and then select **Save** at the bottom
:::image type="content" source="media/vmapps/select-app.png" alt-text="Screenshot showing selecting a VM application to install on the VM.":::
-If you have more than one VM application to install, you can set the install order for each VM application back on the **Advanced tab**.
+If you have more than one VM application to install, you can set the install order for each VM application back on the **Advanced tab**.
+
+You can also deploy the VM application to currently running VMs. Select the **Extensions + applications** option under **Settings** in the left menu when viewing the VM details in the portal.
+
+Choose **VM applications** and then select **Add application** to add your VM application.
+
+Select the VM application from the list, and then select **Save** at the bottom of the page.
+ ### [CLI](#tab/cli) VM applications require [Azure CLI](/cli/azure/install-azure-cli) version 2.30.0 or later.
-Crate the VM application definition using [az sig gallery-application create](/cli/azure/sig/gallery-application#az-sig-gallery-application-create). In this example we are creating a VM application definition named *myApp* for Linux-based VMs.
+Create the VM application definition using [az sig gallery-application create](/cli/azure/sig/gallery-application#az_sig_gallery_application_create). In this example we're creating a VM application definition named *myApp* for Linux-based VMs.
+ ```azurecli-interactive az sig gallery-application create \
az sig gallery-application version create \
--package-file-link "https://<storage account name>.blob.core.windows.net/<container name>/<filename>" \ --install-command "mv myApp .\myApp\myApp" \ --remove-command "rm .\myApp\myApp" \
- --update-command "mv myApp .\myApp\myApp \
+ --update-command "mv myApp .\myApp\myApp" \
  --default-configuration-file-link "https://<storage account name>.blob.core.windows.net/<container name>/<filename>" ```
+Set a VM application on an existing VM using [az vm application set](/cli/azure/vm/application#az-vm-application-set), replacing the parameter values with your own.
+```azurecli-interactive
+az vm application set \
+ --resource-group myResourceGroup \
+ --name myVM \
+  --app-version-ids /subscriptions/{subID}/resourceGroups/MyResourceGroup/providers/Microsoft.Compute/galleries/myGallery/applications/myApp/versions/1.0.0
+```
### [PowerShell](#tab/powershell)
-Create the VM application definition using `New-AzGalleryApplication`. In this example, we are creating a Linux app named *myApp* in the *myGallery* Azure Compute Gallery, in the *myGallery* resource group and I've given a short description of the VM application for my own use. Replace the values as needed.
+Create the VM application definition using `New-AzGalleryApplication`. In this example, we're creating a Linux app named *myApp* in the *myGallery* Azure Compute Gallery, in the *myResourceGroup* resource group, with a short description of the VM application. Replace the values as needed.
```azurepowershell-interactive
-$galleryName = myGallery
-$rgName = myResourceGroup
-$applicationName = myApp
+$galleryName = "myGallery"
+$rgName = "myResourceGroup"
+$applicationName = "myApp"
New-AzGalleryApplication ` -ResourceGroupName $rgName ` -GalleryName $galleryName `
New-AzGalleryApplication `
Create a version of your application using `New-AzGalleryApplicationVersion`. Allowed characters for version are numbers and periods. Numbers must be within the range of a 32-bit integer. Format: *MajorVersion*.*MinorVersion*.*Patch*.
-In this example, we are creating version number *1.0.0*. Replace the values of the variables as needed.
+In this example, we're creating version number *1.0.0*. Replace the values of the variables as needed.
```azurepowershell-interactive
-$version = 1.0.0
+$galleryName = "myGallery"
+$rgName = "myResourceGroup"
+$applicationName = "myApp"
+$version = "1.0.0"
New-AzGalleryApplicationVersion ` -ResourceGroupName $rgName ` -GalleryName $galleryName `
New-AzGalleryApplicationVersion `
-Name $version ` -PackageFileLink "https://<storage account name>.blob.core.windows.net/<container name>/<filename>" ` -Location "East US" `
- -Install myApp.exe /silent `
- -Remove myApp.exe /uninstall `
+ -Install "mv myApp .\myApp\myApp" `
+ -Remove "rm .\myApp\myApp" `
``` To add the application to an existing VM, get the application version and use that to get the VM application version ID. Use the ID to add the application to the VM configuration. ```azurepowershell-interactive
-$vmname = "myVM"
-$vm = Get-AzVM -ResourceGroupName $rgname -Name $vmname
+$galleryName = "myGallery"
+$rgName = "myResourceGroup"
+$applicationName = "myApp"
+$vmName = "myVM"
+$vm = Get-AzVM -ResourceGroupName $rgname -Name $vmName
$appversion = Get-AzGalleryApplicationVersion ` -GalleryApplicationName $applicationname ` -GalleryName $galleryname `
$appversion = Get-AzGalleryApplicationVersion `
-ResourceGroupName $rgname $packageid = $appversion.Id $app = New-AzVmGalleryApplication -PackageReferenceId $packageid
-Add-AzVmGalleryApplication -VM $vmname -GalleryApplication $app
+Add-AzVmGalleryApplication -VM $vm -GalleryApplication $app
Update-AzVM -ResourceGroupName $rgname -VM $vm ```
PUT
|--|--|--| | name | A unique name for the VM Application within the gallery | Max length of 117 characters. Allowed characters are uppercase or lowercase letters, digits, hyphen(-), period (.), underscore (_). Names not allowed to end with period(.). | | supportedOSType | Whether this is a Windows or Linux application | "Windows" or "Linux" |
-| endOfLifeDate | A future end of life date for the application. Note that this is for reference only, and is not enforced. | Valid future date |
+| endOfLifeDate | A future end of life date for the application. Note this is for reference only, and isn't enforced. | Valid future date |
Create a VM application version.
PUT
| Update | Optional. The command to update the application. If not specified and an update is required, the old version will be removed and the new one installed. | Valid command for the given OS | | targetRegions/name | The name of a region to which to replicate | Valid Azure region | | targetRegions/regionalReplicaCount | Optional. The number of replicas in the region to create. Defaults to 1. | Integer between 1 and 3 inclusive |
-| endOfLifeDate | A future end of life date for the application version. Note that this is for customer reference only, and is not enforced. | Valid future date |
-| excludeFromLatest | If specified, this version will not be considered for latest. | True or false |
+| endOfLifeDate | A future end of life date for the application version. Note this is for customer reference only, and isn't enforced. | Valid future date |
+| excludeFromLatest | If specified, this version won't be considered for latest. | True or false |
virtualMachineScaleSets/\<**VMSSName**\>?api-version=2019-03-01
| Field Name | Description | Limitations | |--|--|--| | order | Optional. The order in which the applications should be deployed. See below. | Valid integer |
-| packageReferenceId | A reference the the gallery application version | Valid application version reference |
+| packageReferenceId | A reference to the gallery application version | Valid application version reference |
| configurationReference | Optional. The full url of a storage blob containing the configuration for this deployment. This will override any value provided for defaultConfiguration earlier. | Valid storage blob reference | The order field may be used to specify dependencies between applications. The rules for order are the following: | Case | Install Meaning | Failure Meaning | |--|--|--|
-| No order specified | Unordered applications are installed after ordered applications. There is no guarantee of installation order amongst the unordered applications. | Installation failures of other applications, be it ordered or unordered doesn't affect the installation of unordered applications. |
+| No order specified | Unordered applications are installed after ordered applications. There's no guarantee of installation order amongst the unordered applications. | Installation failures of other applications, whether ordered or unordered, don't affect the installation of unordered applications. |
| Duplicate order values | Application will be installed in any order compared to other applications with the same order. All applications of the same order will be installed after those with lower orders and before those with higher orders. | If a previous application with a lower order failed to install, no applications with this order will install. If any application with this order fails to install, no applications with a higher order will install. |
-| Increasing orders | Application will be installed after those with lower orders and before those with higher orders. | If a previous application with a lower order failed to install, this application will not install. If this application fails to install, no application with a higher order will install. |
+| Increasing orders | Application will be installed after those with lower orders and before those with higher orders. | If a previous application with a lower order failed to install, this application won't install. If this application fails to install, no application with a higher order will install. |
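As an illustration only (subscription, resource group, gallery, application, and VM names are placeholders, and the `applicationProfile`/`galleryApplications` property names are assumptions based on the fields above), ordered applications might be applied to a VM from PowerShell like this:

```azurepowershell-interactive
# Sketch: apply two VM applications with an explicit install order via the REST API
$body = @{
    properties = @{
        applicationProfile = @{
            galleryApplications = @(
                @{ order = 1; packageReferenceId = "/subscriptions/<subID>/resourceGroups/myResourceGroup/providers/Microsoft.Compute/galleries/myGallery/applications/myApp/versions/1.0.0" }
                @{ order = 2; packageReferenceId = "/subscriptions/<subID>/resourceGroups/myResourceGroup/providers/Microsoft.Compute/galleries/myGallery/applications/myOtherApp/versions/1.0.0" }
            )
        }
    }
} | ConvertTo-Json -Depth 10
Invoke-AzRestMethod -Method PATCH -Payload $body `
    -Path "/subscriptions/<subID>/resourceGroups/myResourceGroup/providers/Microsoft.Compute/virtualMachines/myVM?api-version=2022-03-01"
```

A scale set would use the virtualMachineScaleSets path instead of virtualMachines.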
The response will include the full VM model. The following are the relevant parts.
relevant parts.
```
-If the VM applications have not yet been installed on the VM, the value will be empty.
+If the VM applications haven't yet been installed on the VM, the value will be empty.
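As a rough check (assuming the VM model surfaces the applications under properties.applicationProfile), you can retrieve the VM model from PowerShell and inspect that value:

```azurepowershell-interactive
# Retrieve the VM model and list the configured gallery applications (property path is an assumption)
$resp = Invoke-AzRestMethod -Method GET `
    -Path "/subscriptions/<subID>/resourceGroups/myResourceGroup/providers/Microsoft.Compute/virtualMachines/myVM?api-version=2022-03-01"
($resp.Content | ConvertFrom-Json).properties.applicationProfile.galleryApplications
```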
virtual-machines Vm Applications https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/vm-applications.md
Application packages provide benefits over other deployment and packaging method
- Support for virtual machines, and both flexible and uniform scale sets + - If you have Network Security Group (NSG) rules applied on your VM or scale set, downloading the packages from an internet repository might not be possible. And with storage accounts, downloading packages onto locked-down VMs would require setting up private links. + ## What are VM app packages? The VM application packages use multiple resource types:
virtual-machines Run Scripts In Vm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/windows/run-scripts-in-vm.md
The [Custom Script Extension](../extensions/custom-script-windows.md) is primari
The [Run Command](run-command.md) feature enables virtual machine and application management and troubleshooting using scripts, and is available even when the machine is not reachable, for example if the guest firewall doesn't have the RDP or SSH port open. * Run scripts in Azure virtual machines.
-* Can be run using [Azure portal](run-command.md), [REST API](/rest/api/compute/virtual-machines-run-commands/run-command), [Azure CLI](/cli/azure/vm/run-command#az-vm-run-command-invoke), or [PowerShell](/powershell/module/az.compute/invoke-azvmruncommand)
+* Can be run using [Azure portal](run-command.md), [REST API](/azure/virtual-machines/windows/run-command), [Azure CLI](/cli/azure/vm/run-command#az-vm-run-command-invoke), or [PowerShell](/powershell/module/az.compute/invoke-azvmruncommand)
* Quickly run a script and view output and repeat as needed in the Azure portal. * Script can be typed directly or you can run one of the built-in scripts. * Run PowerShell script in Windows machines and Bash script in Linux machines.
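For example, a minimal sketch (resource group, VM name, and script path are placeholders) of invoking Run Command on a Windows VM from Azure PowerShell:

```azurepowershell-interactive
# Run a local PowerShell script on a Windows VM through the Run Command feature
Invoke-AzVMRunCommand -ResourceGroupName "myResourceGroup" -VMName "myWindowsVM" `
    -CommandId "RunPowerShellScript" -ScriptPath ".\sample.ps1"
```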