Updates from: 02/02/2022 06:50:11
Service Microsoft Docs article Related commit history on GitHub Change details
active-directory-b2c Whats New Docs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/whats-new-docs.md
Welcome to what's new in Azure Active Directory B2C documentation. This article lists new docs that have been added and those that have had significant updates in the last three months. To learn what's new with the B2C service, see [What's new in Azure Active Directory](../active-directory/fundamentals/whats-new.md).
+## January 2022
+
+### Updated articles
+
+- [Tutorial: Secure Hybrid Access to applications with Azure AD B2C and F5 BIG-IP](partner-f5.md)
+- [Set up a force password reset flow in Azure Active Directory B2C](force-password-reset.md)
+- [Boolean claims transformations](boolean-transformations.md)
+- [Date claims transformations](date-transformations.md)
+- [General claims transformations](general-transformations.md)
+- [Integer claims transformations](integer-transformations.md)
+- [JSON claims transformations](json-transformations.md)
+- [Define phone number claims transformations in Azure AD B2C](phone-number-claims-transformations.md)
+- [Social accounts claims transformations](social-transformations.md)
+- [String claims transformations](string-transformations.md)
+- [StringCollection claims transformations](stringcollection-transformations.md)
+- [Billing model for Azure Active Directory B2C](billing.md)
+- [Configure SAML identity provider options with Azure Active Directory B2C](identity-provider-generic-saml-options.md)
+- [About claim resolvers in Azure Active Directory B2C custom policies](claim-resolver-overview.md)
+- [Add AD FS as a SAML identity provider using custom policies in Azure Active Directory B2C](identity-provider-adfs-saml.md)
+
## December 2021

### New articles
active-directory-domain-services Manage Group Policy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-domain-services/manage-group-policy.md
This article shows you how to install the Group Policy Management tools, then ed
If you are interested in server management strategy, including machines in Azure and [hybrid connected](../azure-arc/servers/overview.md),
-consider reading how to
-[convert Group Policy content](../governance/policy/how-to/guest-configuration-create-group-policy.md)
-to the
+consider reading about the
[guest configuration](../governance/policy/concepts/guest-configuration.md) feature of [Azure Policy](../governance/policy/overview.md).
active-directory Concept Authentication Passwordless https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/authentication/concept-authentication-passwordless.md
The following providers offer FIDO2 security keys of different form factors that
| IDmelon Technologies Inc. | ![y] | ![y]| ![y]| ![y]| ![n] | https://www.idmelon.com/#idmelon |
| Kensington | ![y] | ![y]| ![n]| ![n]| ![n] | https://www.kensington.com/solutions/product-category/why-biometrics/ |
| KONA I | ![y] | ![n]| ![y]| ![y]| ![n] | https://konai.com/business/security/fido |
-| NEOWAVE | ![n] | ![y]| ![y]| ![n]| ![n] | https://neowave.fr/en/products/fido-range/ |
+| NeoWave | ![n] | ![y]| ![y]| ![n]| ![n] | https://neowave.fr/en/products/fido-range/ |
| Nymi | ![y] | ![n]| ![y]| ![n]| ![n] | https://www.nymi.com/nymi-band |
+| Octatco | ![y] | ![y]| ![n]| ![n]| ![n] | https://octatco.com/ |
| OneSpan Inc. | ![n] | ![y]| ![n]| ![y]| ![n] | https://www.onespan.com/products/fido |
| Thales Group | ![n] | ![y]| ![y]| ![n]| ![n] | https://cpl.thalesgroup.com/access-management/authenticators/fido-devices |
| Thetis | ![y] | ![y]| ![y]| ![y]| ![n] | https://thetis.io/collections/fido2 |
active-directory V2 Oauth2 Client Creds Grant Flow https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/v2-oauth2-client-creds-grant-flow.md
This type of authorization is common for daemons and service accounts that need
In order to enable this ACL-based authorization pattern, Azure AD doesn't require that applications be authorized to get tokens for another application. Thus, app-only tokens can be issued without a `roles` claim. Applications that expose APIs must implement permission checks in order to accept tokens.
-If you'd like to prevent applications from getting role-less app-only access tokens for your application, [ensure that user assignment requirements are enabled for your app](../manage-apps/what-is-access-management.md#requiring-user-assignment-for-an-app). This will block users and applications without assigned roles from being able to get a token for this application.
+If you'd like to prevent applications from getting role-less app-only access tokens for your application, [ensure that assignment requirements are enabled for your app](../manage-apps/what-is-access-management.md#requiring-user-assignment-for-an-app). This will block users and applications without assigned roles from being able to get a token for this application.
### Application permissions
Instead of using ACLs, you can use APIs to expose a set of **application permiss
* Send mail as any user
* Read directory data
-To use application permissions with your own API (as opposed to Microsoft Graph), you must first [expose the API](howto-add-app-roles-in-azure-ad-apps.md) by defining scopes in the API's app registration in the Azure portal. Then, [configure access to the API](howto-add-app-roles-in-azure-ad-apps.md#assign-app-roles-to-applications) by selecting those permissions in your client application's app registration. If you haven't exposed any scopes in your API's app registration, you won't be able to specify application permissions to that API in your client application's app registration in the Azure portal.
+To use app roles (application permissions) with your own API (as opposed to Microsoft Graph), you must first [expose the app roles](howto-add-app-roles-in-azure-ad-apps.md) in the API's app registration in the Azure portal. Then, [configure the required app roles](howto-add-app-roles-in-azure-ad-apps.md#assign-app-roles-to-applications) by selecting those permissions in your client application's app registration. If you haven't exposed any app roles in your API's app registration, you won't be able to specify application permissions to that API in your client application's app registration in the Azure portal.
-When authenticating as an application (as opposed to with a user), you can't use *delegated permissions* - scopes that are granted by a user - because there is no user for your app to act on behalf of. You must use application permissions, also known as roles, that are granted by an admin for the application or via pre-authorization by the web API.
+When authenticating as an application (as opposed to with a user), you can't use *delegated permissions* because there is no user for your app to act on behalf of. You must use application permissions, also known as app roles, that are granted by an admin or by the API's owner.
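For context (this example isn't part of the commit above), a client credentials token request to the Microsoft identity platform v2.0 endpoint takes the following general shape; the tenant, client ID, and secret are placeholders, and `/.default` asks for whatever application permissions have already been granted:

```http
POST /{tenant}/oauth2/v2.0/token HTTP/1.1
Host: login.microsoftonline.com
Content-Type: application/x-www-form-urlencoded

client_id=<client-id>
&scope=https%3A%2F%2Fgraph.microsoft.com%2F.default
&client_secret=<client-secret>
&grant_type=client_credentials
```

If the client has been granted application permissions for the target API, the resulting app-only token carries them in its `roles` claim; under the ACL pattern described earlier, the token can lack a `roles` claim entirely, and the API must enforce its own checks.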
For more information about application permissions, see [Permissions and consent](v2-permissions-and-consent.md#permission-types).
active-directory Whats New Docs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/whats-new-docs.md
Previously updated : 01/03/2022 Last updated : 02/01/2022
Welcome to what's new in the Microsoft identity platform documentation. This article lists new docs that have been added and those that have had significant updates in the last three months.
+## January 2022
+
+### New articles
+
+- [Access Azure AD protected resources from an app in Google Cloud (preview)](workload-identity-federation-create-trust-gcp.md)
+- [Quickstart: Acquire a token and call the Microsoft Graph API by using a console app's identity](console-app-quickstart.md)
+- [Quickstart: Acquire a token and call Microsoft Graph API from a desktop application](desktop-app-quickstart.md)
+- [Quickstart: Add sign-in with Microsoft to a web app](web-app-quickstart.md)
+- [Quickstart: Protect a web API with the Microsoft identity platform](web-api-quickstart.md)
+- [Quickstart: Sign in users and call the Microsoft Graph API from a mobile application](mobile-app-quickstart.md)
+
+### Updated articles
+
+- [Confidential client assertions](msal-net-client-assertions.md)
+- [Claims mapping policy type](reference-claims-mapping-policy-type.md)
+- [Configure an app to trust a GitHub repo (preview)](workload-identity-federation-create-trust-github.md)
+- [Configure an app to trust an external identity provider (preview)](workload-identity-federation-create-trust.md)
+- [Exchange a SAML token issued by AD FS for a Microsoft Graph access token](v2-saml-bearer-assertion.md)
+- [Logging in MSAL.js](msal-logging-js.md)
+- [Permissions and consent in the Microsoft identity platform](v2-permissions-and-consent.md)
+- [Quickstart: Acquire a token and call Microsoft Graph API from a Java console app using app's identity](quickstart-v2-java-daemon.md)
+- [Quickstart: Acquire a token and call Microsoft Graph API from a Python console app using app's identity](quickstart-v2-python-daemon.md)
+- [Quickstart: Add sign-in with Microsoft to a Java web app](quickstart-v2-java-webapp.md)
+- [Quickstart: Add sign-in with Microsoft to a Python web app](quickstart-v2-python-webapp.md)
+- [Quickstart: Add sign-in with Microsoft to an ASP.NET Core web app](quickstart-v2-aspnet-core-webapp.md)
+- [Quickstart: ASP.NET web app that signs in Azure AD users](quickstart-v2-aspnet-webapp.md)
+- [Quickstart: Get a token and call the Microsoft Graph API by using a console app's identity](quickstart-v2-netcore-daemon.md)
+- [Quickstart: Protect an ASP.NET Core web API with the Microsoft identity platform](quickstart-v2-aspnet-core-web-api.md)
+- [Quickstart: Sign in users and call the Microsoft Graph API from an Android app](quickstart-v2-android.md)
+- [Quickstart: Sign in users and call the Microsoft Graph API from an iOS or macOS app](quickstart-v2-ios.md)
+
## December 2021

### New articles
Welcome to what's new in the Microsoft identity platform documentation. This art
- [Token cache serialization in MSAL.NET](msal-net-token-cache-serialization.md)
- [What's new for authentication?](reference-breaking-changes.md)
-## October 2021
-
-### New articles
-
-- [Configure an app to trust a GitHub repo (preview)](workload-identity-federation-create-trust-github.md)
-- [Configure an app to trust an external identity provider (preview)](workload-identity-federation-create-trust.md)
-- [Set up your application's Azure AD test environment](test-setup-environment.md)
-- [Throttling and service limits to consider for testing](test-throttle-service-limits.md)
-- [Workload identity federation (preview)](workload-identity-federation.md)
-
-### Updated articles
-
-- [Considerations for using Xamarin iOS with MSAL.NET](msal-net-xamarin-ios-considerations.md)
-- [Handle ITP in Safari and other browsers where third-party cookies are blocked](reference-third-party-cookies-spas.md)
-- [Initialize client applications using MSAL.js](msal-js-initializing-client-applications.md)
-- [Microsoft Graph API](microsoft-graph-intro.md)
-- [Microsoft identity platform and the OAuth 2.0 client credentials flow](v2-oauth2-client-creds-grant-flow.md)
-- [What's new for authentication?](reference-breaking-changes.md)
active-directory Groups Lifecycle https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/enterprise-users/groups-lifecycle.md
For information on how to download and install the Azure AD PowerShell cmdlets,
## Activity-based automatic renewal
-With Azure AD intelligence, groups are now automatically renewed based on whether they have been recently used. This feature eliminates the need for manual action by group owners, because it's based on user activity in groups across Microsoft 365 services like Outlook, SharePoint, or Teams. For example, if an owner or a group member does something like upload a document to SharePoint, visit a Teams channel, or send an email to the group in Outlook, the group is automatically renewed around 35 days before the group expires and the owner does not get any renewal notifications. The "All Company" group converted in Yammer Native Mode to a Microsoft 365 Group doesn't currently support this type of automatic renewal, and Yammer activities for that group aren't counted as activities.
+With Azure AD intelligence, groups are now automatically renewed based on whether they have been recently used. This feature eliminates the need for manual action by group owners, because it's based on user activity in groups across Microsoft 365 services like Outlook, SharePoint, Teams, or Yammer. For example, if an owner or a group member does something like upload a document to SharePoint, visit a Teams channel, send an email to the group in Outlook, or view a post in Yammer, the group is automatically renewed around 35 days before the group expires and the owner does not get any renewal notifications.
For example, consider an expiration policy that is set so that a group expires after 30 days of inactivity. However, to keep from sending an expiration email the day that group expiration is enabled (because there's no recorded activity yet), Azure AD first waits five days. If there is activity in those five days, the expiration policy works as expected. If there is no activity within five days, we send an expiration/renewal email. Of course, if the group was inactive for five days, an email was sent, and the group then became active, we will automatically renew it and start the expiration period again.
The following user actions cause automatic group renewal:
- SharePoint: View, edit, download, move, share, or upload files
- Outlook: Join group, read/write group message from group space, Like a message (in Outlook Web Access)
- Teams: Visit a Teams channel
+- Yammer: View a post within a Yammer community or an interactive email in Outlook
### Auditing and reporting
active-directory Licensing Service Plan Reference https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/enterprise-users/licensing-service-plan-reference.md
Previously updated : 01/28/2022 Last updated : 01/31/2022
When managing licenses in [the Azure portal](https://portal.azure.com/#blade/Mic
- **Service plans included (friendly names)**: A list of service plans (friendly names) in the product that correspond to the string ID and GUID

>[!NOTE]
->This information last updated on January 28th, 2022.<br/>You can also download a CSV version of this table [here](https://download.microsoft.com/download/e/3/e/e3e9faf2-f28b-490a-9ada-c6089a1fc5b0/Product%20names%20and%20service%20plan%20identifiers%20for%20licensing.csv).
+>This information last updated on January 31st, 2022.<br/>You can also download a CSV version of this table [here](https://download.microsoft.com/download/e/3/e/e3e9faf2-f28b-490a-9ada-c6089a1fc5b0/Product%20names%20and%20service%20plan%20identifiers%20for%20licensing.csv).
><br/>

| Product name | String ID | GUID | Service plans included | Service plans included (friendly names) |
When managing licenses in [the Azure portal](https://portal.azure.com/#blade/Mic
| Dynamics 365 Enterprise Edition - Additional Portal (Qualified Offer) | CRM_ONLINE_PORTAL | a4bfb28e-becc-41b0-a454-ac680dc258d3 | EXCHANGE_S_FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>CRM_ONLINE_PORTAL (1d4e9cb1-708d-449c-9f71-943aa8ed1d6a) | Exchange Foundation (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>Microsoft Dynamics CRM Online - Portal Add-On (1d4e9cb1-708d-449c-9f71-943aa8ed1d6a) |
| Dynamics 365 Field Service Viral Trial | Dynamics_365_Field_Service_Enterprise_viral_trial | 29fcd665-d8d1-4f34-8eed-3811e3fca7b3 | CUSTOMER_VOICE_DYN365_VIRAL_TRIAL (dbe07046-af68-4861-a20d-1c8cbda9194f)<br/>DYN365_FS_ENTERPRISE_VIRAL_TRIAL (20d1455b-72b2-4725-8354-a177845ab77d)<br/>EXCHANGE_S_FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>POWER_APPS_DYN365_VIRAL_TRIAL (54b37829-818e-4e3c-a08a-3ea66ab9b45d)<br/>POWER_AUTOMATE_DYN365_VIRAL_TRIAL (81d4ecb8-0481-42fb-8868-51536c5aceeb) | Customer Voice for Dynamics 365 vTrial (dbe07046-af68-4861-a20d-1c8cbda9194f)<br/>Dynamics 365 Field Service Enterprise vTrial (20d1455b-72b2-4725-8354-a177845ab77d)<br/>Exchange Foundation (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>Power Apps for Dynamics 365 vTrial (54b37829-818e-4e3c-a08a-3ea66ab9b45d)<br/>Power Automate for Dynamics 365 vTrial (81d4ecb8-0481-42fb-8868-51536c5aceeb) |
| Dynamics 365 Finance | DYN365_FINANCE | 55c9eb4e-c746-45b4-b255-9ab6b19d5c62 | DYN365_CDS_FINANCE (e95d7060-d4d9-400a-a2bd-a244bf0b609e)<br/>DYN365_REGULATORY_SERVICE (c7657ae3-c0b0-4eed-8c1d-6a7967bd9c65)<br/>EXCHANGE_S_FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>D365_Finance (9f0e1b4e-9b33-4300-b451-b2c662cd4ff7)<br/>POWERAPPS_DYN_APPS (874fc546-6efe-4d22-90b8-5c4e7aa59f4b)<br/>FLOW_DYN_APPS (7e6d7d78-73de-46ba-83b1-6d25117334ba) | Common Data Service for Dynamics 365 Finance (e95d7060-d4d9-400a-a2bd-a244bf0b609e)<br/>Dynamics 365 for Finance and Operations, Enterprise edition - Regulatory Service (c7657ae3-c0b0-4eed-8c1d-6a7967bd9c65)<br/>Exchange Foundation (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>Microsoft Dynamics 365 for Finance (9f0e1b4e-9b33-4300-b451-b2c662cd4ff7)<br/>Power Apps for Dynamics 365 (874fc546-6efe-4d22-90b8-5c4e7aa59f4b)<br/>Power Automate for Dynamics 365 (7e6d7d78-73de-46ba-83b1-6d25117334ba) |
-| DYNAMICS 365 FOR CUSTOMER SERVICE ENTERPRISE EDITION | DYN365_ENTERPRISE_CUSTOMER_SERVICE | 749742bf-0d37-4158-a120-33567104deeb | DYN365_ENTERPRISE_CUSTOMER_SERVICE (99340b49-fb81-4b1e-976b-8f2ae8e9394f)<br/>FLOW_DYN_APPS (7e6d7d78-73de-46ba-83b1-6d25117334ba)<br/>NBENTERPRISE (03acaee3-9492-4f40-aed4-bcb6b32981b6)<br/>POWERAPPS_DYN_APPS (874fc546-6efe-4d22-90b8-5c4e7aa59f4b)<br/>PROJECT_ESSENTIALS (1259157c-8581-4875-bca7-2ffb18c51bda)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014) |MICROSOFT SOCIAL ENGAGEMENT - SERVICE DISCONTINUATION (03acaee3-9492-4f40-aed4-bcb6b32981b6)<br/>PROJECT ONLINE ESSENTIALS (1259157c-8581-4875-bca7-2ffb18c51bda)<br/>SHAREPOINT ONLINE (PLAN 2) (5dbe027f-2339-4123-9542-606e4d348a72)<br/>FLOW FOR DYNAMICS 365 (7e6d7d78-73de-46ba-83b1-6d25117334ba)<br/>POWERAPPS FOR DYNAMICS 365 (874fc546-6efe-4d22-90b8-5c4e7aa59f4b)<br/>DYNAMICS 365 FOR CUSTOMER SERVICE (99340b49-fb81-4b1e-976b-8f2ae8e9394f)<br/>OFFICE ONLINE (e95bec33-7c88-4a70-8e19-b10bd9d0c014) |
+| Dynamics 365 for Customer Service Enterprise Edition | DYN365_ENTERPRISE_CUSTOMER_SERVICE | 749742bf-0d37-4158-a120-33567104deeb | D365_CSI_EMBED_CSEnterprise (5b1e5982-0e88-47bb-a95e-ae6085eda612)<br/>DYN365_ENTERPRISE_CUSTOMER_SERVICE (99340b49-fb81-4b1e-976b-8f2ae8e9394f)<br/>EXCHANGE_S_FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>Forms_Pro_Service (67bf4812-f90b-4db9-97e7-c0bbbf7b2d09)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>POWERAPPS_DYN_APPS (874fc546-6efe-4d22-90b8-5c4e7aa59f4b)<br/>FLOW_DYN_APPS (7e6d7d78-73de-46ba-83b1-6d25117334ba)<br/>PROJECT_ESSENTIALS (1259157c-8581-4875-bca7-2ffb18c51bda)<br/>NBENTERPRISE (03acaee3-9492-4f40-aed4-bcb6b32981b6)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72) | Dynamics 365 Customer Service Insights for CS Enterprise (5b1e5982-0e88-47bb-a95e-ae6085eda612)<br/>Dynamics 365 for Customer Service (99340b49-fb81-4b1e-976b-8f2ae8e9394f)<br/>Exchange Foundation (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>Microsoft Dynamics 365 Customer Voice for Customer Service Enterprise (67bf4812-f90b-4db9-97e7-c0bbbf7b2d09)<br/>Office for the Web (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>Power Apps for Dynamics 365 (874fc546-6efe-4d22-90b8-5c4e7aa59f4b)<br/>Power Automate for Dynamics 365 (7e6d7d78-73de-46ba-83b1-6d25117334ba)<br/>Project Online Essentials (1259157c-8581-4875-bca7-2ffb18c51bda)<br/>Retired - Microsoft Social Engagement (03acaee3-9492-4f40-aed4-bcb6b32981b6)<br/>SharePoint (Plan 2) (5dbe027f-2339-4123-9542-606e4d348a72) |
| DYNAMICS 365 FOR FINANCIALS BUSINESS EDITION | DYN365_FINANCIALS_BUSINESS_SKU | cc13a803-544e-4464-b4e4-6d6169a138fa | DYN365_FINANCIALS_BUSINESS (920656a2-7dd8-4c83-97b6-a356414dbd36)<br/>FLOW_DYN_APPS (7e6d7d78-73de-46ba-83b1-6d25117334ba)<br/>POWERAPPS_DYN_APPS (874fc546-6efe-4d22-90b8-5c4e7aa59f4b) | FLOW FOR DYNAMICS 365 (7e6d7d78-73de-46ba-83b1-6d25117334ba)<br/>POWERAPPS FOR DYNAMICS 365 (874fc546-6efe-4d22-90b8-5c4e7aa59f4b)<br/>DYNAMICS 365 FOR FINANCIALS (920656a2-7dd8-4c83-97b6-a356414dbd36) |
| DYNAMICS 365 FOR SALES AND CUSTOMER SERVICE ENTERPRISE EDITION | DYN365_ENTERPRISE_SALES_CUSTOMERSERVICE | 8edc2cf8-6438-4fa9-b6e3-aa1660c640cc | DYN365_ENTERPRISE_P1 (d56f3deb-50d8-465a-bedb-f079817ccac1)<br/>FLOW_DYN_APPS (7e6d7d78-73de-46ba-83b1-6d25117334ba)<br/>NBENTERPRISE (03acaee3-9492-4f40-aed4-bcb6b32981b6)<br/>POWERAPPS_DYN_APPS (874fc546-6efe-4d22-90b8-5c4e7aa59f4b)<br/>PROJECT_ESSENTIALS (1259157c-8581-4875-bca7-2ffb18c51bda)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014) | DYNAMICS 365 CUSTOMER ENGAGEMENT PLAN (d56f3deb-50d8-465a-bedb-f079817ccac1)<br/>FLOW FOR DYNAMICS 365 (7e6d7d78-73de-46ba-83b1-6d25117334ba)<br/>MICROSOFT SOCIAL ENGAGEMENT - SERVICE DISCONTINUATION (03acaee3-9492-4f40-aed4-bcb6b32981b6)<br/>POWERAPPS FOR DYNAMICS 365 (874fc546-6efe-4d22-90b8-5c4e7aa59f4b)<br/>PROJECT ONLINE ESSENTIALS (1259157c-8581-4875-bca7-2ffb18c51bda)<br/>SHAREPOINT ONLINE (PLAN 2) (5dbe027f-2339-4123-9542-606e4d348a72)<br/>OFFICE ONLINE (e95bec33-7c88-4a70-8e19-b10bd9d0c014) |
| DYNAMICS 365 FOR SALES ENTERPRISE EDITION | DYN365_ENTERPRISE_SALES | 1e1a282c-9c54-43a2-9310-98ef728faace | DYN365_ENTERPRISE_SALES (2da8e897-7791-486b-b08f-cc63c8129df7)<br/>FLOW_DYN_APPS (7e6d7d78-73de-46ba-83b1-6d25117334ba)<br/>NBENTERPRISE (03acaee3-9492-4f40-aed4-bcb6b32981b6)<br/>POWERAPPS_DYN_APPS (874fc546-6efe-4d22-90b8-5c4e7aa59f4b)<br/>PROJECT_ESSENTIALS (1259157c-8581-4875-bca7-2ffb18c51bda)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014) | DYNAMICS 365 FOR SALES (2da8e897-7791-486b-b08f-cc63c8129df7)<br/>FLOW FOR DYNAMICS 365 (7e6d7d78-73de-46ba-83b1-6d25117334ba)<br/>MICROSOFT SOCIAL ENGAGEMENT - SERVICE DISCONTINUATION (03acaee3-9492-4f40-aed4-bcb6b32981b6)<br/>POWERAPPS FOR DYNAMICS 365 (874fc546-6efe-4d22-90b8-5c4e7aa59f4b)<br/>PROJECT ONLINE ESSENTIALS (1259157c-8581-4875-bca7-2ffb18c51bda)<br/>SHAREPOINT ONLINE (PLAN 2) (5dbe027f-2339-4123-9542-606e4d348a72)<br/>OFFICE ONLINE (e95bec33-7c88-4a70-8e19-b10bd9d0c014) |
active-directory Service Accounts Governing Azure https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/fundamentals/service-accounts-governing-azure.md
There are three types of service accounts in Azure Active Directory (Azure AD): [managed identities](service-accounts-managed-identities.md), [service principals](service-accounts-principal.md), and user accounts employed as service accounts. As you create these service accounts for automated use, they're granted permissions to access resources in Azure and Azure AD. Resources can include Microsoft 365 services, software as a service (SaaS) applications, custom applications, databases, HR systems, and so on. Governing Azure AD service accounts means that you manage their creation, permissions, and lifecycle to ensure security and continuity.

> [!IMPORTANT]
-> We do not recommend using user accounts as service accounts as they are inherently less secure. This includes on-premises service accounts that are synced to Azure AD, as they are not converted to service principals. Instead, we recommend the use of managed identities or service principals. Note that at this time the use of conditional access policies is not possible with service principals, but the functionality is coming.
+> We do not recommend using user accounts as service accounts as they are inherently less secure. This includes on-premises service accounts that are synced to Azure AD, as they are not converted to service principals. Instead, we recommend the use of managed identities or service principals. Note that at this time the use of conditional access policies with service principals is called Conditional Access for workload identities and it's in public preview.
## Plan your service account
active-directory Reference Connect Version History https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/hybrid/reference-connect-version-history.md
ms.assetid: ef2797d7-d440-4a9a-a648-db32ad137494
Previously updated : 10/21/2021 Last updated : 1/31/2022
Topic | Details
Steps to upgrade from Azure AD Connect | Different methods to [upgrade from a previous version to the latest](how-to-upgrade-previous-version.md) Azure AD Connect release.
Required permissions | For permissions required to apply an update, see [Azure AD Connect: Accounts and permissions](reference-connect-accounts-permissions.md#upgrade).
+## Retiring Azure AD Connect 1.x versions
> [!IMPORTANT]
> *On August 31, 2022, all 1.x versions of Azure AD Connect will be retired because they include SQL Server 2012 components that will no longer be supported.* Upgrade to the most recent version of Azure AD Connect (2.x version) by that date or [evaluate and switch to Azure AD cloud sync](../cloud-sync/what-is-cloud-sync.md).
-Make sure you're running a recent version of Azure AD Connect to receive an optimal support experience.
+## Retiring Azure AD Connect 2.x versions
+> [!IMPORTANT]
+> We will begin retiring past versions of Azure AD Connect Sync 2.x 12 months from the date they are superseded by a newer version.
+> This policy will go into effect on 15 March 2023, when we will retire all versions that were superseded by a newer version on or before 15 March 2022.
+>
+> The following versions will retire on 15 March 2023:
+>
+> - 2.0.89.0
+> - 2.0.88.0
+> - 2.0.28.0
+> - 2.0.25.1
+> - 2.0.10.0
+> - 2.0.9.0
+> - 2.0.8.0
+> - 2.0.3.0
+>
+> If you are not already using the latest release version of Azure AD Connect Sync, you should upgrade your Azure AD Connect Sync software before that date.
+>
+> This policy does not change the retirement of all 1.x versions of Azure AD Connect Sync on 31 August 2022, which is due to the retirement of the SQL Server 2012 and Azure AD Authentication Library (ADAL) components.
If you run a retired version of Azure AD Connect, it might unexpectedly stop working. You also might not have the latest security fixes, performance improvements, troubleshooting and diagnostic tools, and service enhancements. If you require support, we might not be able to provide you with the level of service your organization needs.
active-directory Application Sign In Other Problem Access Panel https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/application-sign-in-other-problem-access-panel.md
Previously updated : 07/11/2017 Last updated : 02/01/2022
To check if you have the correct deep link, follow these steps:
4. Select **Enterprise Applications** from the Azure Active Directory left-hand navigation menu.
5. Select **All Applications** to view a list of all your applications.
   - If you do not see the application you want to show up here, use the **Filter** control at the top of the **All Applications List** and set the **Show** option to **All Applications.**
-6. Open the [**Azure portal**](https://portal.azure.com/) and sign in as a **Global Administrator** or **Co-admin.**
-7. Open the **Azure Active Directory Extension** by selecting **All services** at the top of the main left-hand navigation menu.
-8. Type in **"Azure Active Directory"** in the filter search box and select the **Azure Active Directory** item.
-9. Select **Enterprise Applications** from the Azure Active Directory left-hand navigation menu.
-10. Select **All Applications** to view a list of all your applications.
- - If you do not see the application you want show up here, use the **Filter** control at the top of the **All Applications List** and set the **Show** option to **All Applications.**
-11. Select the application you want the check the deep link for.
-12. Find the label **User Access URL**. Your deep link should match this URL.
+6. Select the application you want to check the deep link for.
+7. Find the label **User Access URL**. Your deep link should match this URL.
## Contact support
Open a support ticket with the following information if available:
## Next steps

-- [Quickstart Series on Application Management](view-applications-portal.md)
+- [Quickstart Series on Application Management](view-applications-portal.md)
active-directory Services Azure Active Directory Support https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/managed-identities-azure-resources/services-azure-active-directory-support.md
description: List of services that support Azure AD authentication
Previously updated : 01/10/2022 Last updated : 02/01/2022
The following services support Azure AD authentication. New services are added t
| Azure Databricks | [Authenticate using Azure Active Directory tokens](/azure/databricks/dev-tools/api/latest/aad/) |
| Azure Data Explorer | [How-To Authenticate with Azure Active Directory for Azure Data Explorer Access](/azure/data-explorer/kusto/management/access-control/how-to-authenticate-with-aad) |
| Azure Data Lake Storage Gen1 | [Authentication with Azure Data Lake Storage Gen1 using Azure Active Directory](../../data-lake-store/data-lakes-store-authentication-using-azure-active-directory.md) |
+| Azure Database for PostgreSQL | [Use Azure Active Directory for authentication with PostgreSQL](../../postgresql/howto-configure-sign-in-aad-authentication.md)
| Azure Digital Twins | [Set up an Azure Digital Twins instance and authentication (portal)](../../digital-twins/how-to-set-up-instance-portal.md#set-up-user-access-permissions) |
| Azure Event Hubs | [Authenticate an application with Azure Active Directory to access Event Hubs resources](../../event-hubs/authenticate-application.md) |
| Azure IoT Hub | [Control access to IoT Hub](../../iot-hub/iot-hub-devguide-security.md) |
aks Concepts Scale https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/aks/concepts-scale.md
To get started with manually scaling pods and nodes see [Scale applications in A
## Horizontal pod autoscaler
-Kubernetes uses the horizontal pod autoscaler (HPA) to monitor the resource demand and automatically scale the number of replicas. By default, the horizontal pod autoscaler checks the Metrics API every 30 seconds for any required changes in replica count. When changes are required, the number of replicas is increased or decreased accordingly. Horizontal pod autoscaler works with AKS clusters that have deployed the Metrics Server for Kubernetes 1.8+.
+Kubernetes uses the horizontal pod autoscaler (HPA) to monitor the resource demand and automatically scale the number of replicas. By default, the horizontal pod autoscaler checks the Metrics API every 60 seconds for any required changes in replica count. When changes are required, the number of replicas is increased or decreased accordingly. Horizontal pod autoscaler works with AKS clusters that have deployed the Metrics Server for Kubernetes 1.8+.
![Kubernetes horizontal pod autoscaling](media/concepts-scale/horizontal-pod-autoscaling.png)
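As a reference point (not part of the linked commit), a minimal HPA manifest that scales a deployment on CPU utilization looks roughly like the following sketch. The name and thresholds are placeholders, and `autoscaling/v2` assumes Kubernetes 1.23 or later (older clusters use `autoscaling/v2beta2`):

```yaml
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: my-app-hpa            # placeholder name
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: my-app              # placeholder deployment to scale
  minReplicas: 1
  maxReplicas: 10
  metrics:
  - type: Resource
    resource:
      name: cpu
      target:
        type: Utilization
        averageUtilization: 60   # add replicas when average CPU across pods exceeds 60%
```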
aks Open Service Mesh About https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/aks/open-service-mesh-about.md
OSM can be used to help your AKS deployments in many different ways. For example
- Configure weighted traffic controls between two or more services for A/B testing or canary deployments.
- Collect and view KPIs from application traffic.
+## Add-on limitations
+
+The OSM AKS add-on has the following limitations:
+
+* [Iptables redirection][ip-tables-redirection] for IP address and port range exclusion must be enabled using `kubectl patch` after installation. For more details, see [iptables redirection][ip-tables-redirection].
+* Pods that are onboarded to the mesh that need access to IMDS, Azure DNS, or the Kubernetes API server must have their IP addresses added to the global list of excluded outbound IP ranges using [Global outbound IP range exclusions][global-exclusion].
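As a rough sketch of the `kubectl patch` approach, the following adds the IMDS address to the global outbound exclusion list. The MeshConfig name and namespace here assume the add-on defaults; check the linked OSM documentation for the authoritative values:

```bash
# Sketch: exclude the IMDS IP from outbound traffic interception.
# Assumes the add-on's MeshConfig is named osm-mesh-config in the kube-system namespace.
kubectl patch meshconfig osm-mesh-config -n kube-system --type merge \
  -p '{"spec":{"traffic":{"outboundIPRangeExclusionList":["169.254.169.254/32"]}}}'
```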
[osm-azure-cli]: open-service-mesh-deploy-addon-az-cli.md
-[osm-bicep]: open-service-mesh-deploy-addon-bicep.md
+[osm-bicep]: open-service-mesh-deploy-addon-bicep.md
+[ip-tables-redirection]: https://docs.openservicemesh.io/docs/guides/traffic_management/iptables_redirection/
+[global-exclusion]: https://docs.openservicemesh.io/docs/guides/traffic_management/iptables_redirection/#global-outbound-ip-range-exclusions
aks Open Service Mesh Ip Port Exclusion https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/aks/open-service-mesh-ip-port-exclusion.md
- Title: IP and port range exclusion
-description: Implement IP and port range exclusion
-- Previously updated : 10/8/2021---
-# Implement IP and port range exclusion
-
-Outbound TCP based traffic from applications is by default intercepted using the `iptables` rules programmed by OSM, and redirected to the Envoy proxy sidecar. OSM provides a means to specify a list of IP ranges and ports to exclude from traffic interception if necessary. For guidance on how to exclude IP and port ranges, refer to [this documentation](https://docs.openservicemesh.io/docs/guides/traffic_management/iptables_redirection/).
-
-> [!NOTE]
->
-> - For the Open Service Mesh AKS add-on, **port exclusion can only be implemented after installation using `kubectl patch` and not during installation using the OSM CLI `--set` flag.**
-> - If the application pods that are a part of the mesh need access to IMDS, Azure DNS or the Kubernetes API server, the user needs to explicitly add these IP addresses to the list of Global outbound IP ranges using the above command. See an example of Kubernetes API Server port exclusion [here](https://docs.openservicemesh.io/docs/guides/app_onboarding/#onboard-services).
aks Spot Node Pool https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/aks/spot-node-pool.md
description: Learn how to add a spot node pool to an Azure Kubernetes Service (A
Previously updated : 10/19/2020 Last updated : 01/21/2022 #Customer intent: As a cluster operator or developer, I want to learn how to add a spot node pool to an AKS Cluster.
The following limitations apply when you create and manage AKS clusters with a s
* You cannot change ScaleSetPriority or SpotMaxPrice after creation.
* When setting SpotMaxPrice, the value must be -1 or a positive value with up to five decimal places.
* A spot node pool will have the label *kubernetes.azure.com/scalesetpriority:spot*, the taint *kubernetes.azure.com/scalesetpriority=spot:NoSchedule*, and system pods will have anti-affinity.
-* You must add a [corresponding toleration][spot-toleration] to schedule workloads on a spot node pool.
+* You must add a [corresponding toleration][spot-toleration] and affinity to schedule workloads on a spot node pool.
## Add a spot node pool to an AKS cluster
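This digest doesn't show the command that adds the pool itself; as a sketch, a spot pool is added with `az aks nodepool add` roughly as follows (resource names are placeholders):

```azurecli-interactive
az aks nodepool add \
    --resource-group myResourceGroup \
    --cluster-name myAKSCluster \
    --name spotnodepool \
    --priority Spot \
    --eviction-policy Delete \
    --spot-max-price -1 \
    --no-wait
```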
az aks nodepool show --resource-group myResourceGroup --cluster-name myAKSCluste
Confirm *scaleSetPriority* is *Spot*.
-To schedule a pod to run on a spot node, add a toleration that corresponds to the taint applied to your spot node. The following example shows a portion of a yaml file that defines a toleration that corresponds to a *kubernetes.azure.com/scalesetpriority=spot:NoSchedule* taint used in the previous step.
+To schedule a pod to run on a spot node, add a toleration and node affinity that corresponds to the taint applied to your spot node. The following example shows a portion of a yaml file that defines a toleration that corresponds to the *kubernetes.azure.com/scalesetpriority=spot:NoSchedule* taint and a node affinity that corresponds to the *kubernetes.azure.com/scalesetpriority=spot* label used in the previous step.
```yaml
spec:
  tolerations:
  - key: "kubernetes.azure.com/scalesetpriority"
    operator: "Equal"
    value: "spot"
    effect: "NoSchedule"
+ affinity:
+ nodeAffinity:
+ requiredDuringSchedulingIgnoredDuringExecution:
+ nodeSelectorTerms:
+ - matchExpressions:
+ - key: "kubernetes.azure.com/scalesetpriority"
+ operator: In
+ values:
+ - "spot"
  ...
```
-When a pod with this toleration is deployed, Kubernetes can successfully schedule the pod on the nodes with the taint applied.
+When a pod with this toleration and node affinity is deployed, Kubernetes will successfully schedule the pod on the nodes with the taint and label applied.
## Max price for a spot pool

[Pricing for spot instances is variable][pricing-spot], based on region and SKU. For more information, see pricing for [Linux][pricing-linux] and [Windows][pricing-windows].
In this article, you learned how to add a spot node pool to an AKS cluster. For
[aks-support-policies]: support-policies.md
[aks-faq]: faq.md
[azure-cli-install]: /cli/azure/install-azure-cli
-[az-aks-nodepool-add]: /cli/azure/aks/nodepool#az_aks_nodepool_add
+[az-aks-nodepool-add]: /cli/azure/aks/nodepool#az-aks-nodepool-add
[cluster-autoscaler]: cluster-autoscaler.md
[eviction-policy]: ../virtual-machine-scale-sets/use-spot.md#eviction-policy
[kubernetes-concepts]: concepts-clusters-workloads.md
api-management Api Management Get Started Publish Versions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/api-management/api-management-get-started-publish-versions.md Binary files differ
api-management Quickstart Arm Template https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/api-management/quickstart-arm-template.md
tags: azure-resource-manager -+ Last updated 10/09/2020
app-service Configure Ssl Certificate https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/configure-ssl-certificate.md
The free App Service managed certificate is a turn-key solution for securing you
The free certificate comes with the following limitations:

- Does not support wildcard certificates.
-- Does not support usage as a client certificate by certificate thumbprint (removal of certificate thumbprint is planned).
+- Does not support usage as a client certificate by using certificate thumbprint (removal of certificate thumbprint is planned).
- Does not support private DNS.
- Is not exportable.
- Is not supported on App Service Environment (ASE).
The free certificate comes with the following limitations:
# [Apex domain](#tab/apex)

- Must have an A record pointing to your web app's IP address.
+- Is not supported on apps that are not publicly accessible.
- Is not supported with root domains that are integrated with Traffic Manager.
-- All the above must be met for successful certificate issuances and renewals
+- All the above must be met for successful certificate issuances and renewals.
# [Subdomain](#tab/subdomain)

-- Must have CNAME mapped _directly_ to \<app-name\>.azurewebsites.net; using services that proxy the CNAME value will block certificate issuance and renewal
-- All the above must be met for successful certificate issuance and renewals
+- Must have CNAME mapped _directly_ to `<app-name>.azurewebsites.net`. Mapping to an intermediate CNAME value will block certificate issuance and renewal.
+- All the above must be met for successful certificate issuance and renewals.
--
Click **Rekey** to start the process. This process can take 1-10 minutes to comp
Rekeying your certificate rolls the certificate with a new certificate issued from the certificate authority.
-You may be required to [re-verify domain ownership](#verify-domain-ownership).
+You may be required to [reverify domain ownership](#verify-domain-ownership).
Once the rekey operation is complete, click **Sync**. The sync operation automatically updates the hostname bindings for the certificate in App Service without causing any downtime to your apps.
Once the rekey operation is complete, click **Sync**. The sync operation automat
Because an App Service Certificate is a [Key Vault secret](../key-vault/general/about-keys-secrets-certificates.md), you can export a PFX copy of it and use it for other Azure services or outside of Azure.
+> [!NOTE]
+> The exported certificate is an unmanaged artifact. For example, it isn't synced when the App Service Certificate is [renewed](#renew-an-app-service-certificate). You must export the renewed certificate and install it where you need it.
+ To export the App Service Certificate as a PFX file, run the following commands in the [Cloud Shell](https://shell.azure.com). You can also run it locally if you [installed Azure CLI](/cli/azure/install-azure-cli). Replace the placeholders with the names you used when you [created the App Service certificate](#start-certificate-order). ```azurecli-interactive
The downloaded *appservicecertificate.pfx* file is a raw PKCS12 file that contai
### Delete certificate
-Deletion of an App Service certificate is final and irreversible. Deletion of a App Service Certificate resource results in the certificate being revoked. Any binding in App Service with this certificate becomes invalid. To prevent accidental deletion, Azure puts a lock on the certificate. To delete an App Service certificate, you must first remove the delete lock on the certificate.
+Deletion of an App Service certificate is final and irreversible. Deletion of an App Service Certificate resource results in the certificate being revoked. Any binding in App Service with this certificate becomes invalid. To prevent accidental deletion, Azure puts a lock on the certificate. To delete an App Service certificate, you must first remove the delete lock on the certificate.
Select the certificate in the [App Service Certificates](https://portal.azure.com/#blade/HubsExtension/Resources/resourceType/Microsoft.CertificateRegistration%2FcertificateOrders) page, then select **Locks** in the left navigation.
app-service How To Migrate https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/environment/how-to-migrate.md
Title: How to migrate App Service Environment v2 to App Service Environment v3
description: Learn how to migrate your App Service Environment v2 to App Service Environment v3 Previously updated : 1/28/2022 Last updated : 2/01/2022 zone_pivot_groups: app-service-cli-portal
Ensure you understand how migrating to an App Service Environment v3 will affect
::: zone pivot="experience-azcli"
-When using the Azure CLI to carry out the migration, you should follow the below steps in order and as written since you'll be making Azure REST API calls. The recommended way for making these calls is by using the [Azure CLI](/cli/azure/). For information about other methods, see [Getting Started with Azure REST](/rest/api/azure/).
+The recommended experience for migration is using the [Azure portal](how-to-migrate.md?pivots=experience-azp). If you decide to use the Azure CLI to carry out the migration, you should follow the steps below in order and as written, since you'll be making Azure REST API calls. The recommended way of making these API calls is by using the [Azure CLI](/cli/azure/). For information about other methods, see [Getting Started with Azure REST](/rest/api/azure/).
For this guide, [install the Azure CLI](/cli/azure/install-azure-cli) or use the [Azure Cloud Shell](https://shell.azure.com/).
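For orientation (an illustrative sketch, not the migration endpoint itself), `az rest` is the CLI's general-purpose way to make these REST API calls. For example, retrieving an App Service Environment resource looks like the following, with placeholders for the IDs:

```azurecli-interactive
az rest --method get \
    --uri "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Web/hostingEnvironments/<ase-name>?api-version=2021-02-01"
```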
app-service Overview Arc Integration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/overview-arc-integration.md
Title: 'App Service on Azure Arc' description: An introduction to App Service integration with Azure Arc for Azure operators. Previously updated : 12/03/2021 Last updated : 01/31/2022 # App Service, Functions, and Logic Apps on Azure Arc (Preview)
You can run App Service, Functions, and Logic Apps on an Azure Arc-enabled Kuber
> [!NOTE]
> To learn how to set up your Kubernetes cluster for App Service, Functions, and Logic Apps, see [Create an App Service Kubernetes environment (Preview)](manage-create-arc-environment.md).
-In most cases, app developers need to know nothing more than how to deploy to the correct Azure region that represents the deployed Kubernetes environment. For operators who provide the environment and maintain the underlying Kubernetes infrastructure, you need to be aware of the following Azure resources:
+In most cases, app developers need to know nothing more than how to deploy to the correct Azure region that represents the deployed Kubernetes environment. For operators who provide the environment and maintain the underlying Kubernetes infrastructure, you must be aware of the following Azure resources:
- The connected cluster, which is an Azure projection of your Kubernetes infrastructure. For more information, see [What is Azure Arc-enabled Kubernetes?](../azure-arc/kubernetes/overview.md).
-- A cluster extension, which is a sub-resource of the connected cluster resource. The App Service extension [installs the required pods into your connected cluster](#pods-created-by-the-app-service-extension). For more information about cluster extensions, see [Cluster extensions on Azure Arc-enabled Kubernetes](../azure-arc/kubernetes/conceptual-extensions.md).
+- A cluster extension, which is a subresource of the connected cluster resource. The App Service extension [installs the required pods into your connected cluster](#pods-created-by-the-app-service-extension). For more information about cluster extensions, see [Cluster extensions on Azure Arc-enabled Kubernetes](../azure-arc/kubernetes/conceptual-extensions.md).
- A custom location, which bundles together a group of extensions and maps them to a namespace for created resources. For more information, see [Custom locations on top of Azure Arc-enabled Kubernetes](../azure-arc/kubernetes/conceptual-custom-locations.md).
-- An App Service Kubernetes environment, which enables configuration common across apps but not related to cluster operations. Conceptually, it's deployed into the custom location resource, and app developers create apps into this environment. This is described in greater detail in [App Service Kubernetes environment](#app-service-kubernetes-environment).
+- An App Service Kubernetes environment, which enables configuration common across apps but not related to cluster operations. Conceptually, it's deployed into the custom location resource, and app developers create apps into this environment. This resource is described in greater detail in [App Service Kubernetes environment](#app-service-kubernetes-environment).
## Public preview limitations
-The following public preview limitations apply to App Service Kubernetes environments. They will be updated as changes are made available.
+The following public preview limitations apply to App Service Kubernetes environments. This list of limitations is updated as changes and features are made available.
| Limitation | Details |
|||
The following public preview limitations apply to App Service Kubernetes environ
| Feature: Key vault references | Not available (depends on managed identities) |
| Feature: Pull images from ACR with managed identity | Not available (depends on managed identities) |
| Feature: In-portal editing for Functions and Logic Apps | Not available |
+| Feature: Portal listing of Functions or keys | Not available if cluster is not publicly reachable |
| Feature: FTP publishing | Not available |
| Logs | Log Analytics must be configured with cluster extension; not per-site |

## Pods created by the App Service extension
-When the App Service extension is installed on the Azure Arc-enabled Kubernetes cluster, you see several pods created in the release namespace that was specified. These pods enable your Kubernetes cluster to be an extension of the `Microsoft.Web` resource provider in Azure and support the management and operation of your apps. Optionally, you can choose to have the extension install [KEDA](https://keda.sh/) for event-driven scaling.
+When the App Service extension is installed on the Azure Arc-enabled Kubernetes cluster, several pods are created in the release namespace that was specified. These pods enable your Kubernetes cluster to be an extension of the `Microsoft.Web` resource provider in Azure and support the management and operation of your apps. Optionally, you can choose to have the extension install [KEDA](https://keda.sh/) for event-driven scaling.
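You can confirm this with a standard pod listing; the namespace placeholder below is whatever release namespace you chose at installation:

```bash
kubectl get pods -n <release-namespace>
```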
<!-- You can only have one installation of KEDA on the cluster. If you have one already, you must disable this behavior during installation of the cluster extension `TODO`. -->

The following table describes the role of each pod that is created by default:
The following table describes the role of each pod that is created by default:
## App Service Kubernetes environment
-The App Service Kubernetes environment resource is required before apps may be created. It enables configuration common to apps in the custom location, such as the default DNS suffix.
+The App Service Kubernetes environment resource is required before apps can be created. It enables configuration common to apps in the custom location, such as the default DNS suffix.
-Only one Kubernetes environment resource may be created in a custom location. In most cases, a developer who creates and deploys apps doesn't need to be directly aware of the resource. It can be directly inferred from the provided custom location ID. However, when defining Azure Resource Manager templates, any plan resource needs to reference the resource ID of the environment directly. The custom location values of the plan and the specified environment must match.
+Only one Kubernetes environment resource can be created in a custom location. In most cases, a developer who creates and deploys apps doesn't need to be directly aware of the resource. It can be directly inferred from the provided custom location ID. However, when defining Azure Resource Manager templates, any plan resource needs to reference the resource ID of the environment directly. The custom location values of the plan and the specified environment must match.
## FAQ for App Service, Functions, and Logic Apps on Azure Arc (Preview)
No. Apps cannot be assigned managed identities when running in Azure Arc. If you
### Are there any scaling limits?
-All applications deployed with Azure App Service on Kubernetes with Azure Arc are able to scale within the limits of the underlying Kubernetes cluster. If the underlying Kubernetes Cluster runs out of available compute resources (CPU and memory primarily), then applications will only be able to scale to the number of instances of the application that Kubernetes can schedule with available resource.
+All applications deployed with Azure App Service on Kubernetes with Azure Arc are able to scale within the limits of the underlying Kubernetes cluster. If the underlying Kubernetes Cluster runs out of available compute resources (CPU and memory primarily), then applications will only be able to scale to the number of instances of the application that Kubernetes can schedule with available resource.
### What logs are collected?
-Logs for both system components and your applications are written to standard output. Both log types can be collected for analysis using standard Kubernetes tools. You can also configure the App Service cluster extension with a [Log Analytics workspace](../azure-monitor/logs/log-analytics-overview.md), and it will send all logs to that workspace.
+Logs for both system components and your applications are written to standard output. Both log types can be collected for analysis using standard Kubernetes tools. You can also configure the App Service cluster extension with a [Log Analytics workspace](../azure-monitor/logs/log-analytics-overview.md), and it sends all logs to that workspace.
-By default, logs from system components are sent to the Azure team. Application logs are not sent. You can prevent these logs from being transferred by setting `logProcessor.enabled=false` as an extension configuration setting. This will also disable forwarding of application to your Log Analytics workspace. Disabling the log processor may impact time needed for any support cases, and you will be asked to collect logs from standard output through some other means.
+By default, logs from system components are sent to the Azure team. Application logs are not sent. You can prevent these logs from being transferred by setting `logProcessor.enabled=false` as an extension configuration setting. This configuration setting will also disable forwarding of application logs to your Log Analytics workspace. Disabling the log processor might impact time needed for any support cases, and you will be asked to collect logs from standard output through some other means.
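Applying that setting follows the same `az k8s-extension update` pattern used by the upgrade commands later in this article; a sketch, with the same placeholders:

```azurecli-interactive
az k8s-extension update --cluster-type connectedClusters -c <clustername> -g <resource group> -n <extension name> --configuration-settings "logProcessor.enabled=false"
```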
### What do I do if I see a provider registration error?
-When creating a Kubernetes environment resource, some subscriptions may see a "No registered resource provider found" error. The error details may include a set of locations and api versions that are considered valid. If this happens, it may be that the subscription needs to be re-registered with the Microsoft.Web provider, an operation which has no impact on existing applications or APIs. To re-register, use the Azure CLI to run `az provider register --namespace Microsoft.Web --wait`. Then re-attempt the Kubernetes environment command.
+When creating a Kubernetes environment resource, some subscriptions might see a "No registered resource provider found" error. The error details might include a set of locations and api versions that are considered valid. If this error message is returned, the subscription must be re-registered with the Microsoft.Web provider, an operation that has no impact on existing applications or APIs. To re-register, use the Azure CLI to run `az provider register --namespace Microsoft.Web --wait`. Then reattempt the Kubernetes environment command.
### Can I deploy the Application services extension on an ARM64 based cluster?
ARM64 based clusters are not supported at this time.
- Initial public preview release of Application services extension.
- Support for code and container-based deployments of Web, Function, and Logic Applications.
-- Web application runtime support - .NET 3.1 and 5.0; Node JS 12 and 14; Python 3.6, 3.7, and 3.8; PHP 7.3 and 7.4; Ruby 2.5, 2.5.5, 2.6, and 2.6.2; Java SE 8u232, 8u242, 8u252, 11.05, 11.06 and 11.07; Tomcat 8.5, 8.5.41, 8.5.53, 8.5.57, 9.0, 9.0.20, 9.0.33, and 9.0.37.
+- Web application runtime support .NET 3.1 and 5.0; Node JS 12 and 14; Python 3.6, 3.7, and 3.8; PHP 7.3 and 7.4; Ruby 2.5, 2.5.5, 2.6, and 2.6.2; Java SE 8u232, 8u242, 8u252, 11.05, 11.06 and 11.07; Tomcat 8.5, 8.5.41, 8.5.53, 8.5.57, 9.0, 9.0.20, 9.0.33, and 9.0.37.
### Application services extension v 0.10.0 (November 2021)
ARM64 based clusters are not supported at this time.
- Upgrade Azure Function runtime to v3.3.1
- Set default replica count of App Controller and Envoy Controller to 2 to add further stability
-If your extension was in the stable version and auto-upgrade-minor-version is set to true, the extension will upgrade automatically. To manually upgrade the extension to the latest version, you can run the command below:
+If your extension was in the stable version and auto-upgrade-minor-version is set to true, the extension upgrades automatically. To manually upgrade the extension to the latest version, you can run the command:
```azurecli-interactive
az k8s-extension update --cluster-type connectedClusters -c <clustername> -g <resource group> -n <extension name> --release-train stable --version 0.10.0
```
If your extension was in the stable version and auto-upgrade-minor-version is se
- Added Application Insights support for Java and .NET Web Applications
- Added support for .NET 6.0 Web Applications
- Removed .NET Core 2.0
-- Resolved issues with slot swap operations failing
-- Resolved issues during Ruby app creation
+- Resolved issues that caused slot swap operations to fail
+- Resolved issues customers experienced during creation of Ruby web applications
-If your extension was in the stable version and auto-upgrade-minor-version is set to true, the extension will upgrade automatically. To manually upgrade the extension to the latest version, you can run the command below:
+If your extension was in the stable version and auto-upgrade-minor-version is set to true, the extension upgrades automatically. To manually upgrade the extension to the latest version, you can run the command:
```azurecli-interactive az k8s-extension update --cluster-type connectedClusters -c <clustername> -g <resource group> -n <extension name> --release-train stable --version 0.11.0 ```
+### Application services extension v 0.11.1 (December 2021)
+
+- Minor release to resolve issue with CRD update
+
+If your extension was in the stable version and auto-upgrade-minor-version is set to true, the extension upgrades automatically. To manually upgrade the extension to the latest version, you can run the command:
+
+```azurecli-interactive
+ az k8s-extension update --cluster-type connectedClusters -c <clustername> -g <resource group> -n <extension name> --release-train stable --version 0.11.1
+```
+
+### Application services extension v 0.12.0 (January 2022)
+
+- Support for outbound proxy
+- Support for parallel builds in build service
+- Upgrade Envoy to 1.20.1
+- Resolved issue with Application Insights support for .NET Applications
+
+If your extension was in the stable version and auto-upgrade-minor-version is set to true, the extension upgrades automatically. To manually upgrade the extension to the latest version, you can run the command:
+
+```azurecli-interactive
+ az k8s-extension update --cluster-type connectedClusters -c <clustername> -g <resource group> -n <extension name> --release-train stable --version 0.12.0
+```
## Next steps

[Create an App Service Kubernetes environment (Preview)](manage-create-arc-environment.md)
app-service Tutorial Connect Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/tutorial-connect-overview.md
+
+ Title: 'Securely connect to Azure resources'
+description: Your app service may need to connect to other Azure services such as a database, storage, or another app. This overview recommends the more secure method for connecting.
+ Last updated: 01/26/2022
+# Securely connect to Azure services and databases from Azure App Service
+
+Your app service may need to connect to other Azure services such as a database, storage, or another app. This overview recommends the more secure method for connecting.
+
+|Connection method|When to use|
+|--|--|
+|[Direct connection from App Service managed identity](#connect-to-azure-services-with-managed-identity)|Dependent service [supports managed identity](/azure/active-directory/managed-identities-azure-resources/managed-identities-status)<br><br>* Best for enterprise-level security<br>* Connection to dependent service is secured with managed identity<br>* Large team or automated connection string and secret management<br>* Don't manage credentials manually.<br>* Credentials aren't accessible to you.|
+|[Connect using Key Vault secrets from App Service managed identity](#connect-to-key-vault-with-managed-identity)|Dependent service doesn't support managed identity<br><br>* Best for enterprise-level security<br>* Connection includes non-Azure services such as GitHub, Twitter, Facebook, Google<br>* Large team or automated connection string and secret management<br>* Don't manage credentials manually.<br>* Credentials aren't accessible to you.<br>* Manage connection information with environment variables.|
+|[Connect with app settings](#connect-with-app-settings)|* Best for small team or individual owner of Azure resources.<br>* Stage 1 of multi-stage migration to Azure<br>* Temporary or proof-of-concept applications<br>* Manually manage connection information with environment variables|
+
+## Connect to Azure services with managed identity
+
+Use [managed identity](/azure/active-directory/managed-identities-azure-resources/overview) to authenticate from one Azure resource, such as Azure App Service, to another Azure resource whenever possible. With managed identity, Azure manages the authentication process for you after the required setup is complete. Once the connection is set up, you don't need to manage the connection yourself.
+
+Benefits of managed identity:
+
+* Automated credentials management
+* Many Azure services are included
+* No additional cost
+* No code changes
++
+Learn which [services](/azure/active-directory/managed-identities-azure-resources/managed-identities-status) are supported with managed identity and what [operations you can perform](/azure/active-directory/managed-identities-azure-resources/overview).
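As a minimal sketch of that setup using the Azure CLI, the flow below enables a system-assigned identity and grants it a role on a dependent resource. The role and scope shown are illustrative assumptions; substitute values appropriate for the service you're connecting to.

```azurecli-interactive
# Enable the system-assigned managed identity on the web app
az webapp identity assign --name <app-name> --resource-group <resource-group>

# Grant that identity access to the dependent resource
# (example role shown; use the role your scenario requires)
az role assignment create \
    --assignee <principal-id-returned-by-previous-command> \
    --role "Storage Blob Data Reader" \
    --scope <target-resource-id>
```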
+
+### Example managed identity scenario
+
+The following image demonstrates an App Service connecting to other Azure services:
+
+* A: User visits Azure app service website.
+* B: Securely **connect from** App Service **to** another Azure service using managed identity.
+* C: Securely **connect from** App Service **to** Microsoft Graph.
++
+## Connect to Key Vault with managed identity
+
+When managed identity isn't supported for your app's dependent services, use Key Vault to store your secrets, and connect your app to Key Vault with a managed identity.
+
+Secrets include:
+
+|Secret|Example|
+|--|--|
+|Certificates|SSL certificates|
+|Keys and access tokens|Cognitive service API Key<br>GitHub personal access token<br>Twitter consumer keys and authentication tokens|
+|Connection strings|Database connection strings such as SQL server or MongoDB|
++
+Benefits of managed identity integrated with Key Vault include:
+
+* Connectivity to Key Vault is secured by managed identities
+* Access to the Key Vault is restricted to the app. App contributors, such as administrators, may have complete control of the App Service resources, and at the same time have no access to the Key Vault secrets.
+* No code change is required if your application code already accesses connection secrets with app settings.
+* Monitoring and auditing of who accessed secrets.
+* Rotation of connection information in Key Vault requires no changes in App Service.
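A minimal sketch of this pattern with the Azure CLI follows; the vault, secret, and app names are placeholders, and the app's managed identity must separately be granted permission to read secrets from the vault:

```azurecli-interactive
# Store the secret in Key Vault
az keyvault secret set --vault-name <vault-name> --name MyDbConnection --value "<connection-string>"

# Point an app setting at the Key Vault secret; App Service resolves the reference at runtime
az webapp config appsettings set --name <app-name> --resource-group <resource-group> \
    --settings MyDbConnection="@Microsoft.KeyVault(VaultName=<vault-name>;SecretName=MyDbConnection)"
```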
+
+## Connect with app settings
+
+App Service provides [App settings](configure-common.md?tabs=portal#configure-app-settings) to store connection strings, API keys, and other environment variables. While App Service does provide encryption for app settings, for enterprise-level security consider a service such as Key Vault to manage these types of secrets, which provides additional benefits.
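For example, here's a sketch of managing a connection string as an app setting from the Azure CLI; the setting name `DATABASE_URL` and all other values are placeholders:

```azurecli-interactive
# Create or update an app setting; it surfaces to the app as an environment variable
az webapp config appsettings set --name <app-name> --resource-group <resource-group> \
    --settings DATABASE_URL="<connection-string>"

# Review the app settings currently configured
az webapp config appsettings list --name <app-name> --resource-group <resource-group>
```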
+
+**App settings** are best used when:
+
+* Security of connection information is manual and limited to a few people
+* Web app is temporary, proof-of-concept, or in first migration stage to Azure
+
+**App Service** managed identity to another Azure service is best used when:
+
+* You don't need to manage Azure credentials. Credentials aren't even accessible to you.
+* You can use managed identities to authenticate to any resource that supports Azure Active Directory authentication including your own applications.
+* Managed identities can be used without any additional cost.
+
+**Key Vault** integration from App Service with managed identity is best used when:
+
+* Connectivity to Key Vault is secured by managed identities.
+* Access to the Key Vault is restricted to the app. App contributors, such as administrators, may have complete control of the App Service resources, and at the same time have no access to the Key Vault secrets.
+* No code change is required if your application code already accesses connection secrets with app settings.
+* Monitoring and auditing of who accessed secrets.
++
+## Next steps
+
+* Learn how to use App Service managed identity with:
+ * [SQL server](tutorial-connect-msi-sql-database.md?tabs=windowsclient%2Cdotnet)
+ * [Azure storage](scenario-secure-app-access-storage.md?tabs=azure-portal%2Cprogramming-language-csharp)
+ * [Microsoft Graph](scenario-secure-app-access-microsoft-graph-as-app.md?tabs=azure-powershell%2Cprogramming-language-csharp)
app-service Tutorial Nodejs Mongodb App https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/tutorial-nodejs-mongodb-app.md
Title: 'Tutorial: Node.js app with MongoDB'
-description: Learn how to get a Node.js app working in Azure, with connection to a MongoDB database in Azure (Cosmos DB). Sails.js and Angular 12 are used in the tutorial.
-
+ Title: Deploy a Node.js web app using MongoDB to Azure
+description: This article shows you how to deploy a Node.js app using Express.js and a MongoDB database to Azure. Azure App Service is used to host the web application and Azure Cosmos DB to host the database, using the 100% compatible MongoDB API built into Cosmos DB.
Previously updated : 07/13/2021-
-zone_pivot_groups: app-service-platform-windows-linux
Last updated : 01/31/2022+
+ms.role: developer
+ms.devlang: javascript
+
-# Tutorial: Build a Node.js and MongoDB app in Azure
--
-[Azure App Service](overview.md) provides a highly scalable, self-patching web hosting service. This tutorial shows how to create a Node.js app in App Service on Windows and connect it to a MongoDB database. When you're done, you'll have a MEAN application (MongoDB, Express, AngularJS, and Node.js) running in [Azure App Service](overview.md). The sample application uses a combination of [Sails.js](https://sailsjs.com/) and [Angular 12](https://angular.io/).
----
-[Azure App Service](overview.md) provides a highly scalable, self-patching web hosting service using the Linux operating system. This tutorial shows how to create a Node.js app in App Service on Linux, connect it locally to a MongoDB database, then deploy it to a database in Azure Cosmos DB's API for MongoDB. When you're done, you'll have a MEAN application (MongoDB, Express, AngularJS, and Node.js) running in App Service on Linux. The sample application uses a combination of [Sails.js](https://sailsjs.com/) and [Angular 12](https://angular.io/).
--
-![MEAN app running in Azure App Service](./media/tutorial-nodejs-mongodb-app/run-in-azure.png)
-
-What you'll learn:
-
-> [!div class="checklist"]
-> * Create a MongoDB database in Azure
-> * Connect a Node.js app to MongoDB
-> * Deploy the app to Azure
-> * Update the data model and redeploy the app
-> * Stream diagnostic logs from Azure
-> * Manage the app in the Azure portal
--
-## Prerequisites
-
-To complete this tutorial:
-- [Install Git](https://git-scm.com/)
-- [Install Node.js and NPM](https://nodejs.org/)
-
-## Create local Node.js app
-
-In this step, you set up the local Node.js project.
+# Deploy a Node.js + MongoDB web app to Azure
-### Clone the sample application
+In this tutorial, you'll deploy a sample **Express.js** app using a **MongoDB** database to Azure. The Express.js app will be hosted in Azure App Service which supports hosting Node.js apps in both Linux (Node versions 12, 14, and 16) and Windows (versions 12 and 14) server environments. The MongoDB database will be hosted in Azure Cosmos DB, a cloud native database offering a [100% MongoDB compatible API](/azure/cosmos-db/mongodb/mongodb-introduction).
-In the terminal window, `cd` to a working directory.
-Run the following command to clone the sample repository.
+This article assumes you are already familiar with [Node.js development](/learn/paths/build-javascript-applications-nodejs/) and have Node and MongoDB installed locally. You'll also need an Azure account with an active subscription. If you do not have an Azure account, you [can create one for free](https://azure.microsoft.com/free/nodejs/).
-```bash
-git clone https://github.com/Azure-Samples/mean-todoapp.git
-```
-
-> [!NOTE]
-> For information on how the sample app is created, see [https://github.com/Azure-Samples/mean-todoapp](https://github.com/Azure-Samples/mean-todoapp).
+## Sample application
-### Run the application
-
-Run the following commands to install the required packages and start the application.
+To follow along with this tutorial, clone or download the sample application from the repository [https://github.com/Azure-Samples/msdocs-nodejs-mongodb-azure-sample-app](https://github.com/Azure-Samples/msdocs-nodejs-mongodb-azure-sample-app).
```bash
-cd mean-todoapp
-npm install
-node app.js --alter
+git clone https://github.com/Azure-Samples/msdocs-nodejs-mongodb-azure-sample-app.git
```
-When the app is fully loaded, you see something similar to the following message:
-
-<pre>
-debug: -
-debug: :: Fri Jul 09 2021 13:10:34 GMT+0200 (Central European Summer Time)
+Follow these steps to run the application locally:
-debug: Environment : development
-debug: Port : 1337
-debug: -
-</pre>
+* Install the package dependencies by running `npm install`
+* Copy the `.env.sample` file to `.env` and populate the DATABASE_URL value with your MongoDB URL (for example *mongodb://localhost:27017/*)
+* Start the application using `npm start`
+* To view the app, browse to `http://localhost:3000`
-Navigate to `http://localhost:1337` in a browser. Add a few todo items.
+## 1 - Create the Azure App Service
-The MEAN sample application stores user data in the database. By default, it uses a disk-based development database. If you can create and see todo items, then your app is reading and writing data.
+Azure App Service is used to host the Express.js web app. When setting up the App Service for the application, you will specify:
-![MEAN app loaded successfully](./media/tutorial-nodejs-mongodb-app/run-locally.png)
+* The **Name** for the web app. This name is used as part of the DNS name for your web app, in the form of `https://<app-name>.azurewebsites.net`.
+* The **Runtime** for the app. This is where you select the version of Node to use for your app.
+* The **App Service plan**, which defines the compute resources (CPU, memory) available for the application.
+* The **Resource Group** for the app. A resource group lets you group all of the Azure resources needed for the application together in a logical container.
-To stop Node.js at any time, press `Ctrl+C` in the terminal.
+Azure resources can be created using the [Azure portal](https://portal.azure.com/), VS Code using the [Azure Tools extension pack](https://marketplace.visualstudio.com/items?itemName=ms-vscode.vscode-node-azure-pack), or the Azure CLI.
-## Create production MongoDB
+### [Azure portal](#tab/azure-portal)
-In this step, you create a MongoDB database in Azure. When your app is deployed to Azure, it uses this cloud database.
+Sign in to the [Azure portal](https://portal.azure.com/) and follow these steps to create your Azure App Service resources.
-For MongoDB, this tutorial uses [Azure Cosmos DB](../cosmos-db/index.yml). Cosmos DB supports MongoDB client connections.
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Create app service step 1](<./includes/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-1.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-1-240px.png" alt-text="A screenshot showing how to use the search box in the top tool bar to find App Services in Azure." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-1.png"::: |
+| [!INCLUDE [Create app service step 2](<./includes/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-2.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-2-240px.png" alt-text="A screenshot showing the create button on the App Services page used to create a new web app." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-2.png"::: |
+| [!INCLUDE [Create app service step 3](<./includes/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-3.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-3-240px.png" alt-text="A screenshot showing the form to fill out to create a web app in Azure." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-3.png"::: |
+| [!INCLUDE [Create app service step 4](<./includes/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-4.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-4-240px.png" alt-text="A screenshot of the Spec Picker dialog that allows you to select the App Service plan to use for your web app." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-4.png"::: |
+| [!INCLUDE [Create app service step 4](<./includes/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-5.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-5-240px.png" alt-text="A screenshot of the main web app create page showing the button to select on to create your web app in Azure." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-5.png"::: |
-### Create a resource group
+### [VS Code](#tab/vscode-aztools)
+To create Azure resources in VS Code, you must have the [Azure Tools extension pack](https://marketplace.visualstudio.com/items?itemName=ms-vscode.vscode-node-azure-pack) installed and be signed into Azure from VS Code.
-### Create a Cosmos DB account
+> [!div class="nextstepaction"]
+> [Download Azure Tools extension pack](https://marketplace.visualstudio.com/items?itemName=ms-vscode.vscode-node-azure-pack)
-> [!NOTE]
-> There is a cost to creating the Azure Cosmos DB databases in this tutorial in your own Azure subscription. To use a free Azure Cosmos DB account for seven days, you can use the [Try Azure Cosmos DB for free](https://azure.microsoft.com/try/cosmosdb/) experience. Just click the **Create** button in the MongoDB tile to create a free MongoDB database on Azure. Once the database is created, navigate to **Connection String** in the portal and retrieve your Azure Cosmos DB connection string for use later in the tutorial.
->
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Create app service step 1](<./includes/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-01.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-01-240px.png" alt-text="A screenshot showing the location of the Azure Tools icon in the left toolbar." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-01.png"::: |
+| [!INCLUDE [Create app service step 2](<./includes/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-02.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-02-240px.png" alt-text="A screenshot showing the App Service section of Azure Tools showing how to create a new web app." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-02.png"::: |
+| [!INCLUDE [Create app service step 3](<./includes/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-03.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-03-240px.png" alt-text="A screenshot showing the dialog box used to enter the name of the web app in Azure." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-03.png"::: |
+| [!INCLUDE [Create app service step 4](<./includes/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-04.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-04-240px.png" alt-text="A screenshot of dialog box used to select a resource group or create a new one for the web app." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-04.png"::: |
+| [!INCLUDE [Create app service step 5](<./includes/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-05.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-05-240px.png" alt-text="A screenshot of the dialog box in VS Code used enter a name for the resource group." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-05.png"::: |
+| [!INCLUDE [Create app service step 6](<./includes/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-06.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-06-240px.png" alt-text="A screenshot of the dialog box in VS Code used to select Node 14 LTS as the runtime for the web app." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-06.png"::: |
+| [!INCLUDE [Create app service step 7](<./includes/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-07.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-07-240px.png" alt-text="A screenshot of the dialog in VS Code used to select operating system to use for hosting the web app." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-07.png"::: |
+| [!INCLUDE [Create app service step 8](<./includes/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-08.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-08-240px.png" alt-text="A screenshot of the dialog in VS Code used to select location of the web app resources." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-08.png"::: |
+| [!INCLUDE [Create app service step 9](<./includes/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-09.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-09-240px.png" alt-text="A screenshot of the dialog in VS Code used to select an App Service plan or create a new one." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-09.png"::: |
+| [!INCLUDE [Create app service step 10](<./includes/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-10.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-10-240px.png" alt-text="A screenshot of the dialog in VS Code used to enter the name of the App Service plan." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-10.png"::: |
+| [!INCLUDE [Create app service step 11](<./includes/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-11.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-11-240px.png" alt-text="A screenshot of the dialog in VS Code used to select the pricing tier of the App Service plan." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-11.png"::: |
+| [!INCLUDE [Create app service step 12](<./includes/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-12.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-12-240px.png" alt-text="A screenshot of the dialog in VS Code asking if you want to create an App Insights resource for your web app." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-12.png"::: |
-In the Cloud Shell, create a Cosmos DB account with the [`az cosmosdb create`](/cli/azure/cosmosdb#az_cosmosdb_create) command.
-
-In the following command, substitute a unique Cosmos DB name for the *\<cosmosdb-name>* placeholder. This name is used as the part of the Cosmos DB endpoint, `https://<cosmosdb-name>.documents.azure.com/`, so the name needs to be unique across all Cosmos DB accounts in Azure. The name must contain only lowercase letters, numbers, and the hyphen (-) character, and must be between 3 and 50 characters long.
-
-```azurecli-interactive
-az cosmosdb create --name <cosmosdb-name> --resource-group myResourceGroup --kind MongoDB
-```
-
-The *--kind MongoDB* parameter enables MongoDB client connections.
-
-When the Cosmos DB account is created, the Azure CLI shows information similar to the following example:
-
-<pre>
-{
- "apiProperties": {
- "serverVersion": "3.6"
- },
- "backupPolicy": {
- "periodicModeProperties": {
- "backupIntervalInMinutes": 240,
- "backupRetentionIntervalInHours": 8,
- "backupStorageRedundancy": "Geo"
- },
- "type": "Periodic"
- },
- "capabilities": [
- {
- "name": "EnableMongo"
- }
- ],
- "connectorOffer": null,
- "consistencyPolicy": {
- "defaultConsistencyLevel": "Session",
- "maxIntervalInSeconds": 5,
- "maxStalenessPrefix": 100
- },
- "cors": [],
- "databaseAccountOfferType": "Standard",
- "defaultIdentity": "FirstPartyIdentity",
- "disableKeyBasedMetadataWriteAccess": false,
- "documentEndpoint": "https://&lt;cosmosdb-name&gt;.documents.azure.com:443/",
- ...
- &lt; Output truncated for readability &gt;
-}
-</pre>
-
-## Connect app to production MongoDB
-
-In this step, you connect your sample application to the Cosmos DB database you just created, using a MongoDB connection string.
-
-### Retrieve the database key
-
-To connect to the Cosmos DB database, you need the database key. In the Cloud Shell, use the [`az cosmosdb keys list`](/cli/azure/cosmosdb#az_cosmosdb_keys_list) command to retrieve the primary key.
-
-```azurecli-interactive
-az cosmosdb keys list --name <cosmosdb-name> --resource-group myResourceGroup
-```
+### [Azure CLI](#tab/azure-cli)
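The CLI steps for this tab live in include files that aren't shown in this diff. As a rough sketch under the same choices as the other tabs (Node 14 LTS on Linux, B1 plan), with placeholder names:

```azurecli-interactive
# Create a resource group to hold the app's resources
az group create --name <resource-group> --location <location>

# Create a Linux App Service plan in the B1 pricing tier
az appservice plan create --name <plan-name> --resource-group <resource-group> --sku B1 --is-linux

# Create the web app on that plan with a Node 14 LTS runtime
az webapp create --name <app-name> --resource-group <resource-group> --plan <plan-name> --runtime "NODE|14-lts"
```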
-The Azure CLI shows information similar to the following example:
-<pre>
-{
- "primaryMasterKey": "RS4CmUwzGRASJPMoc0kiEvdnKmxyRILC9BWisAYh3Hq4zBYKr0XQiSE4pqx3UchBeO4QRCzUt1i7w0rOkitoJw==",
- "primaryReadonlyMasterKey": "HvitsjIYz8TwRmIuPEUAALRwqgKOzJUjW22wPL2U8zoMVhGvregBkBk9LdMTxqBgDETSq7obbwZtdeFY7hElTg==",
- "secondaryMasterKey": "Lu9aeZTiXU4PjuuyGBbvS1N9IRG3oegIrIh95U6VOstf9bJiiIpw3IfwSUgQWSEYM3VeEyrhHJ4rn3Ci0vuFqA==",
- "secondaryReadonlyMasterKey": "LpsCicpVZqHRy7qbMgrzbRKjbYCwCKPQRl0QpgReAOxMcggTvxJFA94fTi0oQ7xtxpftTJcXkjTirQ0pT7QFrQ=="
-}
-</pre>
-
-Copy the value of `primaryMasterKey`. You need this information in the next step.
-
-<a name="devconfig"></a>
-### Configure the connection string in your sample application
-
-In your local repository, in _config/datastores.js_, replace the existing content with the following code and save your changes.
-
-```javascript
-module.exports.datastores = {
- default: {
- adapter: 'sails-mongo',
- url: process.env.MONGODB_URI,
- ssl: true,
- },
-};
-```
-
-The `ssl: true` option is required because [Cosmos DB requires TLS/SSL](../cosmos-db/connect-mongodb-account.md#connection-string-requirements). `url` is set to an environment variable, which you will set next.
-
-In the terminal, set the `MONGODB_URI` environment variable. Be sure to replace the two \<cosmosdb-name> placeholders with your Cosmos DB database name, and replace the \<cosmosdb-key> placeholder with the key you copied in the previous step.
-
-```bash
-export MONGODB_URI=mongodb://<cosmosdb-name>:<cosmosdb-key>@<cosmosdb-name>.documents.azure.com:10250/todoapp
-```
-
-> [!NOTE]
-> This connection string follows the format defined in the [Sails.js documentation](https://sailsjs.com/documentation/reference/configuration/sails-config-datastores#?the-connection-url).
-
-### Test the application with MongoDB
-
-In a local terminal window, run `node app.js --alter` again.
-
-```bash
-node app.js --alter
-```
-
-Navigate to `http://localhost:1337` again. If you can create and see todo items, then your app is reading and writing data using the Cosmos DB database in Azure.
-
-In the terminal, stop Node.js by typing `Ctrl+C`.
-
-## Deploy app to Azure
-
-In this step, you deploy your MongoDB-connected Node.js application to Azure App Service.
+
-### Configure a deployment user
+## 2 - Create an Azure Cosmos DB in MongoDB compatibility mode
+Azure Cosmos DB is a fully managed NoSQL database for modern app development. Among its features is a 100% MongoDB-compatible API, allowing you to use your existing MongoDB tools, packages, and applications with Cosmos DB.
-### Create an App Service plan
+### [Azure portal](#tab/azure-portal)
+You must be signed in to the [Azure portal](https://portal.azure.com/) to complete these steps to create a Cosmos DB.
-In the Cloud Shell, create an App Service plan with the [`az appservice plan create`](/cli/azure/appservice/plan) command.
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Create Cosmos DB step 1](<./includes/tutorial-nodejs-mongodb-app/create-cosmos-db-azure-portal-1.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-azure-portal-1-240px.png" alt-text="A screenshot showing how to use the search box in the top tool bar to find Cosmos DB in Azure." lightbox="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-azure-portal-1.png"::: |
+| [!INCLUDE [Create Cosmos DB step 2](<./includes/tutorial-nodejs-mongodb-app/create-cosmos-db-azure-portal-2.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-azure-portal-2-240px.png" alt-text="A screenshot showing the create button on the Cosmos DB page used to create a database." lightbox="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-azure-portal-2.png"::: |
+| [!INCLUDE [Create Cosmos DB step 3](<./includes/tutorial-nodejs-mongodb-app/create-cosmos-db-azure-portal-3.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-azure-portal-3-240px.png" alt-text="A screenshot showing the page where you select the MongoDB API for your Cosmos DB." lightbox="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-azure-portal-3.png"::: |
+| [!INCLUDE [Create Cosmos DB step 4](<./includes/tutorial-nodejs-mongodb-app/create-cosmos-db-azure-portal-4.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-azure-portal-4-240px.png" alt-text="A screenshot showing how to fill out the page to create a new Cosmos DB." lightbox="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-azure-portal-4.png"::: |
-The following example creates an App Service plan named `myAppServicePlan` in the **B1** pricing tier:
+### [VS Code](#tab/vscode-aztools)
-```azurecli-interactive
-az appservice plan create --name myAppServicePlan --resource-group myResourceGroup --sku B1
-```
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Create Cosmos DB step 1](<./includes/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-1.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-1-240px.png" alt-text="A screenshot showing the databases component of the Azure Tools VS Code extension and the location of the button to create a new database." lightbox="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-1.png"::: |
+| [!INCLUDE [Create Cosmos DB step 2](<./includes/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-2.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-2-240px.png" alt-text="A screenshot showing the dialog box used to select the subscription for the new database in Azure." lightbox="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-2.png"::: |
+| [!INCLUDE [Create Cosmos DB step 3](<./includes/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-3.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-3-240px.png" alt-text="A screenshot showing the dialog box used to select the type of database you want to create in Azure." lightbox="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-3.png"::: |
+| [!INCLUDE [Create Cosmos DB step 4](<./includes/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-4.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-4-240px.png" alt-text="A screenshot of dialog box used to enter the name of the new database in Visual Studio Code." lightbox="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-4.png"::: |
+| [!INCLUDE [Create Cosmos DB step 5](<./includes/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-5.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-5-240px.png" alt-text="A screenshot of the dialog to select the throughput mode of the database." lightbox="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-5.png"::: |
+| [!INCLUDE [Create Cosmos DB step 6](<./includes/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-6.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-6-240px.png" alt-text="A screenshot of the dialog in VS Code used to select resource group to put the new database in." lightbox="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-6.png"::: |
+| [!INCLUDE [Create Cosmos DB step 7](<./includes/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-7.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-7-240px.png" alt-text="A screenshot of the dialog in VS Code used to select location for the new database." lightbox="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-7.png"::: |
-When the App Service plan has been created, the Azure CLI shows information similar to the following example:
+### [Azure CLI](#tab/azure-cli)
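The CLI steps for this tab also come from include files not shown in this diff. A minimal sketch; the account name placeholder must be globally unique:

```azurecli-interactive
# Create a Cosmos DB account with the MongoDB-compatible API enabled
az cosmosdb create --name <cosmosdb-account-name> --resource-group <resource-group> --kind MongoDB
```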
-<pre>
-{
- "freeOfferExpirationTime": null,
- "geoRegion": "UK West",
- "hostingEnvironmentProfile": null,
- "hyperV": false,
- "id": "/subscriptions/0000-0000/resourceGroups/myResourceGroup/providers/Microsoft.Web/serverfarms/myAppServicePlan",
- "isSpot": false,
- "isXenon": false,
- "kind": "app",
- "location": "ukwest",
- "maximumElasticWorkerCount": 1,
- "maximumNumberOfWorkers": 0,
- &lt; JSON data removed for brevity. &gt;
-}
-</pre>
+
+## 3 - Connect your App Service to your Cosmos DB
-In the Cloud Shell, create an App Service plan with the [`az appservice plan create`](/cli/azure/appservice/plan) command.
+To connect to your Cosmos DB database, you need to provide the connection string for the database to your application. This is done in the sample application by reading the `DATABASE_URL` environment variable. When running locally, the sample application uses the [dotenv package](https://www.npmjs.com/package/dotenv) to read the connection string value from the `.env` file.
-<!-- [!INCLUDE [app-service-plan](app-service-plan.md)] -->
+When running in Azure, configuration values like connection strings can be stored in the *application settings* of the App Service hosting the web app. These values are then made available to your application as environment variables during runtime. In this way, the application accesses the connection string from `process.env` the same way whether it's run locally or in Azure. Further, this eliminates the need to manage and deploy environment-specific config files with your application.
-The following example creates an App Service plan named `myAppServicePlan` in the **B1** pricing tier:
+### [Azure portal](#tab/azure-portal)
-```azurecli-interactive
-az appservice plan create --name myAppServicePlan --resource-group myResourceGroup --sku B1 --is-linux
-```
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Connection string step 1](<./includes/tutorial-nodejs-mongodb-app/connection-string-azure-portal-1.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/connection-string-azure-portal-1-240px.png" alt-text="A screenshot showing the location of the Cosmos DB connection string on the Cosmos DB quick start page." lightbox="./media/tutorial-nodejs-mongodb-app/connection-string-azure-portal-1.png"::: |
+| [!INCLUDE [Connection string step 2](<./includes/tutorial-nodejs-mongodb-app/connection-string-azure-portal-2.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/connection-string-azure-portal-2-240px.png" alt-text="A screenshot showing how to search for and navigate to the App Service where the connection string needs to store the connection string." lightbox="./media/tutorial-nodejs-mongodb-app/connection-string-azure-portal-2.png"::: |
+| [!INCLUDE [Connection string step 3](<./includes/tutorial-nodejs-mongodb-app/connection-string-azure-portal-3.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/connection-string-azure-portal-3-240px.png" alt-text="A screenshot showing how to access the Application settings within an App Service." lightbox="./media/tutorial-nodejs-mongodb-app/connection-string-azure-portal-3.png"::: |
+| [!INCLUDE [Connection string step 4](<./includes/tutorial-nodejs-mongodb-app/connection-string-azure-portal-4.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/connection-string-azure-portal-4-240px.png" alt-text="A screenshot showing the dialog used to set an application setting in Azure App Service." lightbox="./media/tutorial-nodejs-mongodb-app/connection-string-azure-portal-4.png"::: |
-When the App Service plan has been created, the Azure CLI shows information similar to the following example:
+### [VS Code](#tab/vscode-aztools)
-<pre>
-{
- "freeOfferExpirationTime": null,
- "geoRegion": "West Europe",
- "hostingEnvironmentProfile": null,
- "id": "/subscriptions/0000-0000/resourceGroups/myResourceGroup/providers/Microsoft.Web/serverfarms/myAppServicePlan",
- "kind": "linux",
- "location": "West Europe",
- "maximumNumberOfWorkers": 1,
- "name": "myAppServicePlan",
- &lt; JSON data removed for brevity. &gt;
- "targetWorkerSizeId": 0,
- "type": "Microsoft.Web/serverfarms",
- "workerTierName": null
-}
-</pre>
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Connection string step 1](<./includes/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-1.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-1-240px.png" alt-text="A screenshot showing how to copy the connection string for a Cosmos database to your clipboard in VS Code." lightbox="./media/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-1.png"::: |
+| [!INCLUDE [Connection string step 2](<./includes/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-2.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-2-240px.png" alt-text="A screenshot showing how to add a config setting to an App Service in VS Code." lightbox="./media/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-2.png"::: |
+| [!INCLUDE [Connection string step 3](<./includes/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-3.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-3-240px.png" alt-text="A screenshot showing the dialog box used to give a name to an app setting in VS Code." lightbox="./media/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-3.png"::: |
+| [!INCLUDE [Connection string step 4](<./includes/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-4.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-4-240px.png" alt-text="A screenshot showing the dialog used to set the value of an app setting in VS Code." lightbox="./media/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-4.png"::: |
+| [!INCLUDE [Connection string step 4](<./includes/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-5.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-5-240px.png" alt-text="A screenshot showing how to view an app setting for an App Service in VS Code." lightbox="./media/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-5.png"::: |
+### [Azure CLI](#tab/azure-cli)
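As a sketch of the equivalent CLI flow, with placeholder names; the connection string value comes from the output of the first command:

```azurecli-interactive
# Retrieve the MongoDB connection strings for the Cosmos DB account
az cosmosdb keys list --name <cosmosdb-account-name> --resource-group <resource-group> --type connection-strings

# Store the connection string in the DATABASE_URL app setting the app reads at runtime
az webapp config appsettings set --name <app-name> --resource-group <resource-group> \
    --settings DATABASE_URL="<connection-string>"
```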
-<a name="create"></a>
-### Create a web app
-+
+## 4 - Deploy application code to Azure
+Azure App Service supports multiple methods to deploy your application code to Azure, including GitHub Actions and all major CI/CD tools. This article focuses on how to deploy your code from your local workstation to Azure.
+### [Deploy using VS Code](#tab/vscode-deploy)
+To deploy your application code directly from VS Code, you must have the [Azure Tools extension pack](https://marketplace.visualstudio.com/items?itemName=ms-vscode.vscode-node-azure-pack) installed and be signed into Azure from VS Code.
-### Configure an environment variable
+> [!div class="nextstepaction"]
+> [Download Azure Tools extension pack](https://marketplace.visualstudio.com/items?itemName=ms-vscode.vscode-node-azure-pack)
-Remember that the sample application is already configured to use the `MONGODB_URI` environment variable in `config/datastores.js`. In App Service, you inject this variable by using an [app setting](configure-common.md#configure-app-settings).
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Deploy from VS Code 1](<./includes/tutorial-nodejs-mongodb-app/deploy-visual-studio-code-1.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/deploy-visual-studio-code-1-240px.png" alt-text="A screenshot showing the location of the Azure Tool icon in Visual Studio Code." lightbox="./media/tutorial-nodejs-mongodb-app/deploy-visual-studio-code-1.png"::: |
+| [!INCLUDE [Deploy from VS Code 2](<./includes/tutorial-nodejs-mongodb-app/deploy-visual-studio-code-2.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/deploy-visual-studio-code-2-240px.png" alt-text="A screenshot showing how you deploy an application to Azure by right-clicking on a web app in VS Code and selecting deploy from the context menu." lightbox="./media/tutorial-nodejs-mongodb-app/deploy-visual-studio-code-2.png"::: |
+| [!INCLUDE [Deploy from VS Code 3](<./includes/tutorial-nodejs-mongodb-app/deploy-visual-studio-code-3.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/deploy-visual-studio-code-3-240px.png" alt-text="A screenshot showing the dialog box used to select the deployment directory in VS Code." lightbox="./media/tutorial-nodejs-mongodb-app/deploy-visual-studio-code-3.png"::: |
+| [!INCLUDE [Deploy from VS Code 3](<./includes/tutorial-nodejs-mongodb-app/deploy-visual-studio-code-4.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/deploy-visual-studio-code-4-240px.png" alt-text="A screenshot showing the Output window of VS Code while deploying an application to Azure." lightbox="./media/tutorial-nodejs-mongodb-app/deploy-visual-studio-code-4.png"::: |
-To set app settings, use the [`az webapp config appsettings set`](/cli/azure/webapp/config/appsettings#az_webapp_config_appsettings_set) command in the Cloud Shell.
-The following example configures a `MONGODB_URI` app setting in your Azure app. Replace the *\<app-name>*, *\<cosmosdb-name>*, and *\<cosmosdb-key>* placeholders.
+### [Deploy using Local Git](#tab/local-git-deploy)
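A sketch of the Local Git flow with the Azure CLI; the Git remote URL is returned by the first command, and the branch name `main` is an assumption:

```azurecli-interactive
# Configure the web app for Local Git deployment; note the Git URL in the output
az webapp deployment source config-local-git --name <app-name> --resource-group <resource-group>

# Add the returned URL as a remote and push to deploy
git remote add azure <git-url-from-previous-command>
git push azure main
```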
-```azurecli-interactive
-az webapp config appsettings set --name <app-name> --resource-group myResourceGroup --settings MONGODB_URI='mongodb://<cosmosdb-name>:<cosmosdb-key>@<cosmosdb-name>.documents.azure.com:10250/todoapp' DEPLOYMENT_BRANCH='main'
-```
-> [!NOTE]
-> `DEPLOYMENT_BRANCH` is a special app setting that tells the deployment engine which Git branch you're deploying to in App Service.
-
-### Push to Azure from Git
---
-<pre>
-Enumerating objects: 5, done.
-Counting objects: 100% (5/5), done.
-Delta compression using up to 8 threads
-Compressing objects: 100% (3/3), done.
-Writing objects: 100% (3/3), 318 bytes | 318.00 KiB/s, done.
-Total 3 (delta 2), reused 0 (delta 0), pack-reused 0
-remote: Updating branch 'main'.
-remote: Updating submodules.
-remote: Preparing deployment for commit id '4eb0ca7190'.
-remote: Generating deployment script.
-remote: Running deployment command...
-remote: Handling node.js deployment.
-remote: Creating app_offline.htm
-remote: KuduSync.NET from: 'D:\home\site\repository' to: 'D:\home\site\wwwroot'
-remote: Copying file: 'package.json'
-remote: Deleting app_offline.htm
-remote: Looking for app.js/server.js under site root.
-remote: Using start-up script app.js
-remote: Generated web.config.
-.
-.
-.
-remote: Deployment successful.
-To https://&lt;app-name&gt;.scm.azurewebsites.net/&lt;app-name&gt;.git
- * [new branch]      main -> main
-</pre>
-
-> [!TIP]
-> During Git deployment, the deployment engine runs `npm install --production` as part of its build automation.
->
-> - As defined in `package.json`, the `postinstall` script is picked up by `npm install` and runs `ng build` to generate the production files for Angular and deploy them to the [assets](https://sailsjs.com/documentation/concepts/assets) folder.
-> - `scripts` in `package.json` can use tools that are installed in `node_modules/.bin`. Since `npm install` has installed `node_modules/.bin/ng` too, you can use it to deploy your Angular client files. This npm behavior is exactly the same in Azure App Service.
-> Packages under `devDependencies` in `package.json` are not installed. Any package you need in the production environment needs to be moved under `dependencies`.
->
-> If your app needs to bypass the default automation and run custom automation, see [Run Grunt/Bower/Gulp](configure-language-nodejs.md#run-gruntbowergulp).
---
-<pre>
-Enumerating objects: 5, done.
-Counting objects: 100% (5/5), done.
-Delta compression using up to 8 threads
-Compressing objects: 100% (3/3), done.
-Writing objects: 100% (3/3), 347 bytes | 347.00 KiB/s, done.
-Total 3 (delta 2), reused 0 (delta 0), pack-reused 0
-remote: Deploy Async
-remote: Updating branch 'main'.
-remote: Updating submodules.
-remote: Preparing deployment for commit id 'f776be774a'.
-remote: Repository path is /home/site/repository
-remote: Running oryx build...
-remote: Operation performed by Microsoft Oryx, https://github.com/Microsoft/Oryx
-remote: You can report issues at https://github.com/Microsoft/Oryx/issues
-remote:
-remote: Oryx Version: 0.2.20210420.1, Commit: 85c6e9278aae3980b86cb1d520aaad532c814ed7, ReleaseTagName: 20210420.1
-remote:
-remote: Build Operation ID: |qwejn9R4StI=.5e8a3529_
-remote: Repository Commit : f776be774a3ea8abc48e5ee2b5132c037a636f73
-.
-.
-.
-remote: Deployment successful.
-remote: Deployment Logs : 'https://&lt;app-name&gt;.scm.azurewebsites.net/newui/jsonviewer?view_url=/api/deployments/a6fcf811136739f145e0de3be82ff195bca7a68b/log'
-To https://&lt;app-name&gt;.scm.azurewebsites.net/&lt;app-name&gt;.git
- 4f7e3ac..a6fcf81 main -> main
-</pre>
-
-> [!TIP]
-> During Git deployment, the deployment engine runs `npm install` as part of its build automation.
->
-> - As defined in `package.json`, the `postinstall` script is picked up by `npm install` and runs `ng build` to generate the production files for Angular and deploy them to the [assets](https://sailsjs.com/documentation/concepts/assets) folder.
-> - `scripts` in `package.json` can use tools that are installed in `node_modules/.bin`. Since `npm install` has installed `node_modules/.bin/ng` too, you can use it to deploy your Angular client files. This npm behavior is exactly the same in Azure App Service.
-> When build automation is complete, the whole completed repository is copied into the `/home/site/wwwroot` folder, out of which your app is hosted.
->
-> If your app needs to bypass the default automation and run custom automation, see [Run Grunt/Bower/Gulp](configure-language-nodejs.md#run-gruntbowergulp).
--
-### Browse to the Azure app
-
-Browse to the deployed app using your web browser.
-
-```bash
-https://<app-name>.azurewebsites.net
-```
-
-If you can create and see todo items in the browser, then your sample app in Azure has connectivity to the MongoDB (Cosmos DB) database.
-
-![MEAN app running in Azure App Service](./media/tutorial-nodejs-mongodb-app/run-in-azure.png)
-
-**Congratulations!** You're running a data-driven Node.js app in Azure App Service.
-
-## Update data model and redeploy
-
-In this step, you change the `Todo` data model and publish your change to Azure.
-
-### Update the server-side model
-
-In Sails.js, changing the server-side model and API code is as simple as changing the data model, because [Sails.js already defines the common routes](https://sailsjs.com/documentation/concepts/blueprints/blueprint-routes#?restful-routes) for a model by default.
-
-In your local repository, open _api/models/Todo.js_ and add a `done` attribute. When you're done, your schema code should look like this:
-
-```javascript
-module.exports = {
-
- attributes: {
- value: {type: 'string'},
- done: {type: 'boolean', defaultsTo: false}
- },
-
-};
-```
+### [Deploy using a ZIP file](#tab/azure-cli-deploy)
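A sketch of a ZIP deployment from the Azure CLI, assuming you've zipped the project files first; depending on your setup, you may also need to enable build automation through the `SCM_DO_BUILD_DURING_DEPLOYMENT` app setting so dependencies are restored on the server:

```azurecli-interactive
# Deploy the contents of a ZIP file to the web app
az webapp deployment source config-zip --resource-group <resource-group> --name <app-name> --src <path-to-zip-file>
```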
-### Update the client code
-There are three files you need to modify: the client model, the HTML template, and the component file.
+
-Open _client/src/app/todo.ts_ and add a `done` property. When you're done, your model show look like this:
+## 5 - Browse to the application
-```typescript
-export class Todo {
- id!: String;
- value!: String;
- done!: Boolean;
-}
-```
+The application will have a URL of the form `https://<app-name>.azurewebsites.net`. Browse to this URL to view the application.
-Open _client/src/app/app.component.html_. Just above the only `<span>` element, add the following code to add a checkbox at the beginning of each todo item:
+Use the form elements in the application to add and complete tasks.
-```html
-<input class="form-check-input me-2" type="checkbox" [checked]="todo.done" (click)="toggleDone(todo.id, i)" [disabled]="isProcessing">
-```
+![A screenshot showing the application running in a browser.](./media/tutorial-nodejs-mongodb-app/sample-app-in-browser.png)
-Open _client/src/app/app.component.ts_. Just above the last closing curly brace (`}`), insert the following method. It's called by the template code above when the checkbox is clicked and updates the server-side data.
-
-```typescript
-toggleDone(id:any, i:any) {
- console.log("Toggled checkbox for " + id);
- this.isProcessing = true;
- this.Todos[i].done = !this.Todos[i].done;
- this.restService.updateTodo(id, this.Todos[i])
- .subscribe((res) => {
- console.log('Data updated successfully!');
- this.isProcessing = false;
- }, (err) => {
- console.log(err);
- this.Todos[i].done = !this.Todos[i].done;
- });
-}
-```
+## 6 - Configure and view application logs
-### Test your changes locally
+Azure App Service captures all messages logged to the console to assist you in diagnosing issues with your application. The sample app outputs console log messages in each of its endpoints to demonstrate this capability. For example, the `get` endpoint outputs a message about the number of tasks retrieved from the database and an error message if something goes wrong.
-In the local terminal window, compile the updated Angular client code with the build script defined in `package.json`.
-```bash
-npm run build
-```
-
-Test your changes with `node app.js --alter` again. Since you changed your server-side model, the `--alter` flag lets `Sails.js` alter the data structure in your Cosmos DB database.
-
-```bash
-node app.js --alter
-```
+The contents of the App Service diagnostic logs can be reviewed in the Azure portal, VS Code, or using the Azure CLI.
-Navigate to `http://localhost:1337`. You should now see a checkbox in front of todo item. When you select or clear a checkbox, the Cosmos DB database in Azure is updated to indicate that the todo item is done.
+### [Azure portal](#tab/azure-portal)
-![Added Done data and UI](./media/tutorial-nodejs-mongodb-app/added-done.png)
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Stream logs from Azure portal 1](<./includes/tutorial-nodejs-mongodb-app/stream-logs-azure-portal-1.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/stream-logs-azure-portal-1-240px.png" alt-text="A screenshot showing how to enable App Service logs for a web app in the Azure portal." lightbox="./media/tutorial-nodejs-mongodb-app/stream-logs-azure-portal-1.png"::: |
+| [!INCLUDE [Stream logs from Azure portal 2](<./includes/tutorial-nodejs-mongodb-app/stream-logs-azure-portal-2.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/stream-logs-azure-portal-2-240px.png" alt-text="A screenshot showing how to view the log stream for a web app in the Azure portal." lightbox="./media/tutorial-nodejs-mongodb-app/stream-logs-azure-portal-2.png"::: |
-In the terminal, stop Node.js by typing `Ctrl+C`.
+### [VS Code](#tab/vscode-aztools)
-### Publish changes to Azure
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Stream logs from VS Code 1](<./includes/tutorial-nodejs-mongodb-app/stream-logs-visual-studio-code-1.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/stream-logs-visual-studio-code-1-240px.png" alt-text="A screenshot showing the location of the Azure Tool icon in Visual Studio Code." lightbox="./media/tutorial-nodejs-mongodb-app/stream-logs-visual-studio-code-1.png"::: |
+| [!INCLUDE [Stream logs from VS Code 2](<./includes/tutorial-nodejs-mongodb-app/stream-logs-visual-studio-code-2.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/stream-logs-visual-studio-code-2-240px.png" alt-text="A screenshot showing how to start streaming logs for a web app from the context menu in VS Code." lightbox="./media/tutorial-nodejs-mongodb-app/stream-logs-visual-studio-code-2.png"::: |
-In the local terminal window, commit your changes in Git, then push the code changes to Azure.
+### [Azure CLI](#tab/azure-cli)
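The CLI steps for this tab come from include files not shown here; as a sketch with placeholder names:

```azurecli-interactive
# Enable application logging to the App Service file system
az webapp log config --name <app-name> --resource-group <resource-group> --application-logging filesystem

# Stream the console log output to your terminal; stop with Ctrl+C
az webapp log tail --name <app-name> --resource-group <resource-group>
```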
-```bash
-git commit -am "added done field"
-git push azure main
-```
-Once the `git push` is complete, navigate to your Azure app and try out the new functionality.
+
-![Model and database changes published to Azure](media/tutorial-nodejs-mongodb-app/added-done-published.png)
+## 7 - Inspect deployed files using Kudu
-If you added any articles earlier, you still can see them. Existing data in your Cosmos DB is not lost. Also, your updates to the data schema and leaves your existing data intact.
+Azure App Service provides a web-based diagnostics console named [Kudu](/azure/app-service/resources-kudu) that allows you to examine the server hosting environment for your web app. Using Kudu, you can view the files deployed to Azure, review the deployment history of the application, and even open an SSH session into the hosting environment.
-## Stream diagnostic logs
+To access Kudu, navigate to one of the following URLs. You'll need to sign in to the Kudu site with your Azure credentials.
+* For apps deployed in Free, Shared, Basic, Standard, and Premium App Service plans - `https://<app-name>.scm.azurewebsites.net`
+* For apps deployed in Isolated service plans - `https://<app-name>.scm.<ase-name>.p.azurewebsites.net`
-While your Node.js application runs in Azure App Service, you can get the console logs piped to your terminal. That way, you can get the same diagnostic messages to help you debug application errors.
+From the main page in Kudu, you can access information about the application hosting environment, app settings, and deployments, and you can browse the files in the wwwroot directory.
-To start log streaming, use the [`az webapp log tail`](/cli/azure/webapp/log#az_webapp_log_tail) command in the Cloud Shell.
+![A screenshot of the main page in the Kudu SCM app showing the different information available about the hosting environment.](./media/tutorial-nodejs-mongodb-app/kudu-main-page.png)
-```azurecli-interactive
-az webapp log tail --name <app-name> --resource-group myResourceGroup
-```
+Selecting the *Deployments* link under the REST API header will show you a history of deployments of your web app.
-Once log streaming has started, refresh your Azure app in the browser to get some web traffic. You now see console logs piped to your terminal.
+![A screenshot of the deployments JSON in the Kudu SCM app showing the history of deployments to this web app.](./media/tutorial-nodejs-mongodb-app/kudu-deployments-list.png)
-Stop log streaming at any time by typing `Ctrl+C`.
+Selecting the *Site wwwroot* link under the Browse Directory heading allows you to browse and view the files on the web server.
+![A screenshot of files in the wwwroot directory showing how Kudu allows you to see what has been deployed to Azure.](./media/tutorial-nodejs-mongodb-app/kudu-wwwroot-files.png)
+## Clean up resources
+When you are finished, you can delete all of the resources from Azure by deleting the resource group for the application.
+### [Azure portal](#tab/azure-portal)
-## Manage your Azure app
+Follow these steps while signed in to the Azure portal to delete a resource group.
-Go to the [Azure portal](https://portal.azure.com) to see the app you created.
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Remove resource group Azure portal 1](<./includes/tutorial-nodejs-mongodb-app/remove-resource-group-azure-portal-1.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/remove-resource-group-azure-portal-1-240px.png" alt-text="A screenshot showing how to search for and navigate to a resource group in the Azure portal." lightbox="./media/tutorial-nodejs-mongodb-app/remove-resource-group-azure-portal-1.png"::: |
+| [!INCLUDE [Remove resource group Azure portal 2](<./includes/tutorial-nodejs-mongodb-app/remove-resource-group-azure-portal-2.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/remove-resource-group-azure-portal-2-240px.png" alt-text="A screenshot showing the location of the Delete Resource Group button in the Azure portal." lightbox="./media/tutorial-nodejs-mongodb-app/remove-resource-group-azure-portal-2.png"::: |
+| [!INCLUDE [Remove resource group Azure portal 3](<./includes/tutorial-nodejs-mongodb-app/remove-resource-group-azure-portal-3.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/remove-resource-group-azure-portal-3-240px.png" alt-text="A screenshot of the confirmation dialog for deleting a resource group in the Azure portal." lightbox="./media/tutorial-nodejs-mongodb-app/remove-resource-group-azure-portal-3.png"::: |
-From the left menu, click **App Services**, then click the name of your Azure app.
+### [VS Code](#tab/vscode-aztools)
-![Portal navigation to Azure app](./media/tutorial-nodejs-mongodb-app/access-portal.png)
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Remove resource group VS Code 1](<./includes/tutorial-nodejs-mongodb-app/remove-resource-group-visual-studio-code-1.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/remove-resource-group-visual-studio-code-1-240px.png" alt-text="A screenshot showing how to delete a resource group in VS Code using the Azure Tools extension." lightbox="./media/tutorial-nodejs-mongodb-app/remove-resource-group-visual-studio-code-1.png"::: |
+| [!INCLUDE [Remove resource group VS Code 2](<./includes/tutorial-nodejs-mongodb-app/remove-resource-group-visual-studio-code-2.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/remove-resource-group-visual-studio-code-2-240px.png" alt-text="A screenshot of the confirmation dialog for deleting a resource group from VS Code." lightbox="./media/tutorial-nodejs-mongodb-app/remove-resource-group-visual-studio-code-2.png"::: |
-By default, the portal shows your app's **Overview** page. This page gives you a view of how your app is doing. Here, you can also perform basic management tasks like browse, stop, start, restart, and delete. The tabs on the left side of the page show the different configuration pages you can open.
+### [Azure CLI](#tab/azure-cli)
-![App Service page in Azure portal](./media/tutorial-nodejs-mongodb-app/web-app-blade.png)
+
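+Using the Azure CLI instead, you can delete the resource group, which removes the app, the database, and everything else in the group. A minimal sketch; the resource group name is a placeholder for the one you created earlier in this tutorial:
+
+```azurecli
+az group delete --name <resource-group-name> --no-wait --yes
+```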
-<a name="next"></a>
## Next steps
-What you learned:
-
-> [!div class="checklist"]
-> * Create a MongoDB database in Azure
-> * Connect a Node.js app to MongoDB
-> * Deploy the app to Azure
-> * Update the data model and redeploy the app
-> * Stream logs from Azure to your terminal
-> * Manage the app in the Azure portal
-
-Advance to the next tutorial to learn how to map a custom DNS name to the app.
-
-> [!div class="nextstepaction"]
-> [Map an existing custom DNS name to Azure App Service](app-service-web-tutorial-custom-domain.md)
-
-Or, check out other resources:
+> [!div class="nextstepaction"]
+> [JavaScript on Azure developer center](/azure/developer/javascript)
-- [Configure Node.js app](configure-language-nodejs.md)-- [Environment variables and app settings reference](reference-app-settings.md)
+> [!div class="nextstepaction"]
+> [Configure Node.js app in App Service](/azure/app-service/configure-language-nodejs)
automation Source Control Integration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/automation/source-control-integration.md
Source control integration lets you easily collaborate with your team, track changes, and roll back to earlier versions of your runbooks. For example, source control allows you to synchronize different branches in source control with your development, test, and production Automation accounts. > [!NOTE]
-> Source control synchronization jobs are run under the user's Automation account and are billed at the same rate as other Automation jobs.
+> Source control synchronization jobs are run under the user's Automation account and are billed at the same rate as other Automation jobs. Additionally, Azure Automation jobs don't support multi-factor authentication (MFA).
## Source control types
automation Hybrid Runbook Worker https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/automation/troubleshoot/hybrid-runbook-worker.md
The following issues are possible causes:
* There's a mistyped workspace ID or workspace key (primary) in the agent's settings. * The Hybrid Runbook Worker can't download the configuration, which causes an account linking error. When Azure enables features on machines, it supports only certain regions for linking a Log Analytics workspace and an Automation account. It's also possible that an incorrect date or time is set on the computer. If the time is +/- 15 minutes from the current time, feature deployment fails.
+* The Log Analytics gateway is not configured to support the Hybrid Runbook Worker.
#### Resolution
To verify if the agent's workspace ID or workspace key was mistyped, see [Adding
##### Configuration not downloaded
-Your Log Analytics workspace and Automation account must be in a linked region. For a list of supported regions, see [Azure Automation and Log Analytics workspace mappings](../how-to/region-mappings.md).
+Your Log Analytics workspace and Automation account must be in a linked region. This resolution applies to the system Hybrid Runbook Worker that Update Management uses. For a list of supported regions, see [Azure Automation and Log Analytics workspace mappings](../how-to/region-mappings.md).
You might also need to update the date or time zone of your computer. If you select a custom time range, make sure that the range is in UTC, which can differ from your local time zone.
+##### Log Analytics gateway not configured
+
+Follow the steps in [Configure the Log Analytics gateway for Automation Hybrid Runbook Workers](/azure/azure-monitor/agents/gateway#configure-for-automation-hybrid-runbook-workers) to add Hybrid Runbook Worker endpoints to the Log Analytics gateway.
++ ### <a name="set-azstorageblobcontent-execution-fails"></a>Scenario: Set-AzStorageBlobContent fails on a Hybrid Runbook Worker #### Issue
azure-arc Azure Data Studio Dashboards https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/data/azure-data-studio-dashboards.md
[Azure Data Studio](/sql/azure-data-studio/what-is) provides an experience similar to the Azure portal for viewing information about your Azure Arc resources. These views are called **dashboards** and have a layout and options similar to what you could see about a given resource in the Azure portal, but give you the flexibility of seeing that information locally in your environment in cases where you don't have a connection available to Azure. -
-## Connecting to a data controller
+## Connect to a data controller
### Prerequisites - Download [Azure Data Studio](/sql/azure-data-studio/download-azure-data-studio) - Azure Arc extension is installed
+### Connect
+
+1. Open Azure Data Studio.
+2. Select the **Connections** tab on the left.
+3. Expand the panel called **Azure Arc Controllers**.
+4. Select the **Connect Controller** button.
+ Azure Data Studio opens a blade on the right side.
-### Connect
+5. Enter the **Namespace** for the data controller.
-1. Open Azure Data Studio
-2. Select the **Connections** tab on the left
-3. Expand the panel called **Azure Arc Controllers**
-4. Click the **Connect Controller** button. This will open a blade on the right side
-5. By default, Azure Data Studio will try to read from the kube.config file in your default directory and list the available kubernetes cluster contexts and pre-select the current cluster context. If this is the right cluster to connect to, enter the namespace where the Azure Arc data controller is deployed in the input for **Namespace**. If you need to retrieve the namespace where the Azure Arc data controller is deployed, you can run ```kubectl get datacontrollers -A``` on your kubernetes cluster.
-6. Optionally add a display name for the Azure Arc data controller in the input for **Name**
-7. Select **Connect**
+   Azure Data Studio reads from the `kube.config` file in your default directory and lists the available Kubernetes cluster contexts, pre-selecting the current cluster context. If this is the right cluster to connect to, enter the namespace where the Azure Arc data controller is deployed.
+   If you need to retrieve that namespace, you can run `kubectl get datacontrollers -A` on your Kubernetes cluster, as shown in the example after these steps.
-Now that you are connected to a data controller, you can view the dashboards for the data controller and any SQL managed instances or PostgreSQL Hyperscale server group resources that you have.
+6. Optionally add a display name for the Azure Arc data controller in the input for **Name**.
+7. Select **Connect**.
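+For reference, here's a minimal way to find the namespace from a terminal where `kubectl` points at your cluster; the namespace and controller name in this output are hypothetical:
+
+```console
+kubectl get datacontrollers -A
+
+NAMESPACE   NAME      STATE
+arc         arc-dc    Ready
+```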
-## View the Data Controller dashboard
+
+After you connect to a data controller, you can view the dashboards. Azure Data Studio has dashboards for the data controller and any SQL managed instances or PostgreSQL Hyperscale server group resources that you have.
+
+## View the data controller dashboard
Right-click on the data controller in the Connections panel in the **Arc Controllers** expandable panel and choose **Manage**.
Conveniently, you can launch the creation of a SQL managed instance or PostgreSQ
You can also open the Azure portal in context to this data controller by clicking the Open in Azure portal button.
-## View the SQL managed instance dashboards
+## View the SQL Managed Instance dashboards
-If you have created some SQL managed instances, you can see them listed in the Connections panel in the Azure Data Controllers expandable panel underneath the data controller that is managing them.
+If you have created some SQL Managed Instances, they are listed under **Connections** in the **Azure Data Controllers** expandable panel underneath the data controller that manages them.
-To view the SQL managed instance dashboard for a given instance, right-click on the instance and choose Manage.
+To view the SQL Managed Instance dashboard for a given instance, right-click on the instance and choose **Manage**.
-The Connection panel will pop up on the right and prompt you for the login/password to connect to that SQL instance. If you know the connection information you can enter it and click Connect. If you don't know, you can click Cancel. Either way, you will be brought to the dashboard when the Connection panel closes.
+The **Connection** panel prompts you for the login and password to connect to an instance. If you know the connection information, enter it and choose **Connect**. If you don't, choose **Cancel**. Either way, Azure Data Studio returns to the dashboard when the **Connection** panel closes.
-On the Overview tab you can view details about the SQL managed instance such as resource group, data controller, subscription ID, status, region and more. You can also see link that you can click to go into the Grafana or Kibana dashboards in context to that SQL managed instance.
+On the **Overview** tab, view resource group, data controller, subscription ID, status, region, and other information. This location also provides links to the Grafana dashboard for viewing metrics or Kibana dashboard for viewing logs in context to that SQL managed instance.
-If you are able to connect to the SQL manage instance, you can see additional information here.
+With a connection to the SQL managed instance, you can see additional information here.
You can delete the SQL managed instance from here or open the Azure portal to view the SQL managed instance in the Azure portal.
-If you click on the Connection Strings tab on the left, you can see a list of pre-constructed connection strings for that SQL managed instance making it easy for you to copy/paste into various other applications or code.
+If you select the **Connection Strings** tab, Azure Data Studio presents a list of pre-constructed connection strings for that instance. Copy and paste these strings into various other applications or code.
## View the PostgreSQL Hyperscale server group dashboards
-If you have created some PostgreSQL Hyperscale server groups, you can see them listed in the Connections panel in the Azure Data Controllers expandable panel underneath the data controller that is managing them.
+If the deployment includes PostgreSQL Hyperscale server groups, Azure Data Studio lists them in the **Connections** panel in the **Azure Data Controllers** expandable panel underneath the data controller that is managing them.
To view the PostgreSQL Hyperscale server group dashboard for a given server group, right-click on the server group and choose Manage.
-On the Overview tab you can view details about the server group such as resource group, data controller, subscription ID, status, region and more. You can also see link that you can click to go into the Grafana or Kibana dashboards in context to that server group.
+On the **Overview** tab, review details about the server group such as resource group, data controller, subscription ID, status, region and more. The tab also has links to the Grafana dashboard for viewing metrics or Kibana dashboard for viewing logs in context to that server group.
You can delete the server group from here or open the Azure portal to view the server group in the Azure portal.
-If you click on the Connection Strings tab on the left, you can see a list of pre-constructed connection strings for that server group making it easy for you to copy/paste into various other applications or code.
+If you select the **Connection Strings** tab on the left, Azure Data Studio provides pre-constructed connection strings for that server group. Copy and paste these strings into various other applications or code.
+
+Select the **Properties** tab on the left to see additional details.
+
+The **Resource health** tab on the left displays the current health of that server group.
-If you click on the Properties tab on the left, you can see additional details.
+The **Diagnose and solve problems** tab on the left launches the PostgreSQL troubleshooting notebook.
-If you click on the Resource health tab on the left you can see the current high-level health of that server group.
+For Azure support, select the **New support request** tab. This launches the Azure portal in context to the server group. Create an Azure support request from there.
-If you click on the Diagnose and solve problems tab on the left, you can launch the PostgreSQL troubleshooting notebook.
+## Next steps
-If you click on the New support request tab on the left, you can launch the Azure portal in context to the server group and create an Azure support request from there.
+- [View SQL Managed Instance in the Azure portal](view-arc-data-services-inventory-in-azure-portal.md)
azure-arc Create Complete Managed Instance Directly Connected https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/data/create-complete-managed-instance-directly-connected.md
The next step is to create the data controller in directly connected mode via th
:::image type="content" source="media/create-complete-managed-instance-directly-connected/custom-location.png" alt-text="Create a new custom location and specify a namespace."::: 1. For **Kubernetes configuration template**, specify *azure-arc-aks-premium-storage* because this example uses an AKS cluster.
-1. Set a user name and password for the metrics and log services.
+2. For **Service type**, select **Load balancer**.
+3. Set a user name and password for the metrics and log services.
The passwords must be at least eight characters long and contain characters from three of the following four categories: Latin uppercase letters, Latin lowercase letters, numbers, and non-alphanumeric characters.
azure-arc Delete Managed Instance https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/data/delete-managed-instance.md
Name Replicas ServerEndpoint State
demo-mi 1/1 10.240.0.4:32023 Ready ```
-## Delete a Azure Arc-enabled SQL Managed Instance
+## Delete Azure Arc-enabled SQL Managed Instance
+ To delete a SQL Managed Instance, run the following command: ```azurecli
Deleted demo-mi from namespace arc
## Reclaim the Kubernetes Persistent Volume Claims (PVCs)
-Deleting a SQL Managed Instance does not remove its associated [PVCs](https://kubernetes.io/docs/concepts/storage/persistent-volumes/). This is by design. The intention is to help the user to access the database files in case the deletion of instance was accidental. Deleting PVCs is not mandatory. However it is recommended. If you don't reclaim these PVCs, you'll eventually end up with errors as your Kubernetes cluster will out of disk space. To reclaim the PVCs, take the following steps:
+A PersistentVolumeClaim (PVC) is a request for storage from the Kubernetes cluster, made when storage is created and added for a SQL Managed Instance. Deleting a SQL Managed Instance does not remove its associated [PVCs](https://kubernetes.io/docs/concepts/storage/persistent-volumes/). This is by design. The intention is to help you access the database files in case the deletion of the instance was accidental. Deleting the PVCs is not mandatory, but it is recommended. If you don't reclaim these PVCs, your Kubernetes cluster will eventually run out of disk space, and reusing the same SQL Managed Instance name when creating a new instance might cause inconsistencies. To reclaim the PVCs, take the following steps:
### 1. List the PVCs for the server group you deleted+ To list the PVCs, run the following command: ```console kubectl get pvc ```
-In the follow example below, notice the PVCs for the SQL Managed Instances you deleted.
+In the example below, notice the PVCs for the SQL Managed Instances you deleted.
+ ```console # kubectl get pvc -n arc
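+After you identify the PVCs for the deleted instance, remove them with `kubectl delete pvc`. A minimal sketch, assuming the instance was named demo-mi in the arc namespace and that its PVCs follow the data-/logs- pattern shown in the list output (verify the exact names against your own output):
+
+```console
+kubectl delete pvc data-demo-mi-0 logs-demo-mi-0 -n arc
+```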
azure-arc Delete Postgresql Hyperscale Server Group https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/data/delete-postgresql-hyperscale-server-group.md
az postgres arc-server delete -n postgres01 --k8s-namespace <namespace> --use-k8
## Reclaim the Kubernetes Persistent Volume Claims (PVCs)
-Deleting a server group does not remove its associated [PVCs](https://kubernetes.io/docs/concepts/storage/persistent-volumes/). This is by design. The intention is to help the user to access the database files in case the deletion of instance was accidental. Deleting PVCs is not mandatory. However it is recommended. If you don't reclaim these PVCs, you'll eventually end up with errors as your Kubernetes cluster will think it's running out of disk space.
+A PersistentVolumeClaim (PVC) is a request for storage from the Kubernetes cluster, made when storage is created and added for a PostgreSQL Hyperscale server group. Deleting a server group does not remove its associated [PVCs](https://kubernetes.io/docs/concepts/storage/persistent-volumes/). This is by design. The intention is to help you access the database files in case the deletion of the instance was accidental. Deleting the PVCs is not mandatory, but it is recommended. If you don't reclaim these PVCs, your Kubernetes cluster will eventually think it's running out of disk space, and reusing the same server group name when creating a new one might cause inconsistencies.
To reclaim the PVCs, take the following steps: ### 1. List the PVCs for the server group you deleted
azure-arc Monitor Grafana Kibana https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/data/monitor-grafana-kibana.md
# View logs and metrics using Kibana and Grafana
-Kibana and Grafana web dashboards are provided to bring insight and clarity to the Kubernetes namespaces being used by Azure Arc-enabled data services.
+Kibana and Grafana web dashboards are provided to bring insight and clarity to the Kubernetes namespaces being used by Azure Arc-enabled data services. To find the service endpoints for the Kibana and Grafana web dashboards, see the [Azure Data Studio dashboards](/azure/azure-arc/data/azure-data-studio-dashboards) documentation.
+
azure-arc Upgrade Sql Managed Instance Cli https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/data/upgrade-sql-managed-instance-cli.md
Title: Upgrade an an indirectly connected Azure Arc-enabled Managed Instance using the CLI
+ Title: Upgrade an indirectly connected Azure Arc-enabled Managed Instance using the CLI
description: Article describes how to upgrade an indirectly connected Azure Arc-enabled Managed Instance using the CLI
Preparing to upgrade sql sqlmi-1 in namespace arc to data controller version.
### General Purpose
-During a SQL Managed Instance General Purpose upgrade, the containers in the pod will be upgraded and will be reprovisioned. This will cause a short amount of downtime as the new pod is created. You will need to build resiliency into your application, such as connection retry logic, to ensure minimal disruption. Read [Overview of the reliability pillar](/azure/architecture/framework/resiliency/overview) for more information on architecting resiliency.
+During a SQL Managed Instance General Purpose upgrade, the containers in the pod will be upgraded and will be reprovisioned. This will cause a short amount of downtime as the new pod is created. You will need to build resiliency into your application, such as connection retry logic, to ensure minimal disruption. Read [Overview of the reliability pillar](/azure/architecture/framework/resiliency/overview) for more information on architecting resiliency, and [Retry guidance for Azure services](/azure/architecture/best-practices/retry-service-specific#sql-database-using-adonet) for guidance on handling transient faults.
To upgrade the Managed Instance, use the following command:
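As a sketch for indirect mode, the upgrade command looks like the following; it assumes the `arcdata` Azure CLI extension is installed, and the instance and namespace names are placeholders:

```azurecli
az sql mi-arc upgrade --name <instance-name> --k8s-namespace <namespace> --use-k8s
```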
azure-arc Upgrade Sql Managed Instance Direct Cli https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/data/upgrade-sql-managed-instance-direct-cli.md
Preparing to upgrade sql sqlmi-1 in namespace arc to data controller version.
### General Purpose
-During a SQL Managed Instance General Purpose upgrade, the containers in the pod will be upgraded and will be reprovisioned. This will cause a short amount of downtime as the new pod is created. You will need to build resiliency into your application, such as connection retry logic, to ensure minimal disruption. Read [Overview of the reliability pillar](/azure/architecture/framework/resiliency/overview) for more information on architecting resiliency.
+During a SQL Managed Instance General Purpose upgrade, the containers in the pod will be upgraded and will be reprovisioned. This will cause a short amount of downtime as the new pod is created. You will need to build resiliency into your application, such as connection retry logic, to ensure minimal disruption. Read [Overview of the reliability pillar](/azure/architecture/framework/resiliency/overview) for more information on architecting resiliency, and [Retry guidance for Azure services](/azure/architecture/best-practices/retry-service-specific#sql-database-using-adonet) for guidance on handling transient faults.
To upgrade the Managed Instance, use the following command:
azure-functions Create First Function Vs Code Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/create-first-function-vs-code-python.md
Before you get started, make sure you have the following requirements in place:
+ The [Azure Functions Core Tools](functions-run-local.md#install-the-azure-functions-core-tools) version 3.x.
-+ [Python versions that are supported by Azure Functions](supported-languages.md#languages-by-runtime-version)
++ [Python versions that are supported by Azure Functions](supported-languages.md#languages-by-runtime-version). For more information, see [How to install Python](https://wiki.python.org/moin/BeginnersGuide/Download). + [Visual Studio Code](https://code.visualstudio.com/) on one of the [supported platforms](https://code.visualstudio.com/docs/supporting/requirements#_platforms).
azure-functions Functions How To Azure Devops https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-how-to-azure-devops.md
You can use the following sample to create a YAML file to build a .NET app:
```yaml pool:
- vmImage: 'windows-2019'
+ vmImage: 'windows-latest'
steps: - script: | dotnet restore
You can use the following sample to create a YAML file to build a JavaScript app
```yaml pool:
- vmImage: ubuntu-latest # Use 'windows-2019' if you have Windows native +Node modules
+ vmImage: ubuntu-latest # Use 'windows-latest' if you have Windows native +Node modules
steps: - bash: | if [ -f extensions.csproj ]
You can use the following sample to create a YAML file to package a PowerShell a
```yaml pool:
- vmImage: 'windows-2019'
+ vmImage: 'windows-latest'
steps: - task: ArchiveFiles@2 displayName: "Archive files"
azure-functions Functions Identity Access Azure Sql With Managed Identity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-identity-access-azure-sql-with-managed-identity.md
# Tutorial: Connect a function app to Azure SQL with managed identity and SQL bindings
-Azure Functions provides a [managed identity](/azure/active-directory/managed-identities-azure-resources/overview.md), which is a turn-key solution for securing access to [Azure SQL Database](/azure/sql-database/) and other Azure services. Managed identities make your app more secure by eliminating secrets from your app, such as credentials in the connection strings. In this tutorial, you'll add managed identity to an Azure Function that utilizes [Azure SQL bindings](/azure/azure-functions/functions-bindings-azure-sql). A sample Azure Function project with SQL bindings is available in the [ToDo backend example](/samples/azure-samples/azure-sql-binding-func-dotnet-todo/todo-backend-dotnet-azure-sql-bindings-azure-functions/)).
+Azure Functions provides a [managed identity](/azure/active-directory/managed-identities-azure-resources/overview.md), which is a turn-key solution for securing access to [Azure SQL Database](/azure/sql-database/) and other Azure services. Managed identities make your app more secure by eliminating secrets from your app, such as credentials in the connection strings. In this tutorial, you'll add managed identity to an Azure Function that utilizes [Azure SQL bindings](/azure/azure-functions/functions-bindings-azure-sql). A sample Azure Function project with SQL bindings is available in the [ToDo backend example](/samples/azure-samples/azure-sql-binding-func-dotnet-todo/todo-backend-dotnet-azure-sql-bindings-azure-functions/).
When you're finished with this tutorial, your Azure Function will connect to Azure SQL database without the need for a username and password.
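For example, the connection string the function app uses can then omit credentials entirely. A hedged sketch that stores such a connection string as an app setting with the Azure CLI; all names are placeholders, and the `Authentication` keyword assumes a Microsoft.Data.SqlClient version that supports managed identity:

```azurecli
az functionapp config appsettings set \
    --name <function-app-name> \
    --resource-group <resource-group> \
    --settings "SqlConnectionString=Server=<server-name>.database.windows.net; Database=<database-name>; Authentication=Active Directory Managed Identity;"
```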
azure-functions Functions Reference Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-reference-python.md
In this function, the value of the `name` query parameter is obtained from the `
Likewise, you can set the `status_code` and `headers` for the response message in the returned [HttpResponse] object.
+## Web frameworks
+
+You can leverage WSGI and ASGI-compatible frameworks such as Flask and FastAPI with your HTTP-triggered Python functions. This section shows how to modify your functions to support these frameworks.
+
+First, the function.json file must be updated to include a `route` in the HTTP trigger, as shown in the following example:
+
+```json
+{
+ "scriptFile": "__init__.py",
+ "bindings": [
+ {
+ "route": "test",
+ "authLevel": "anonymous",
+ "type": "httpTrigger",
+ "direction": "in",
+ "name": "req",
+ "methods": [
+ "get",
+ "post"
+ ]
+ },
+ {
+ "type": "http",
+ "direction": "out",
+ "name": "$return"
+ }
+ ]
+}
+```
+
+The host.json file must also be updated to include an HTTP `routePrefix`, as shown in the following example.
+
+```json
+{
+ "version": "2.0",
+ "logging": {
+ "applicationInsights": {
+ "samplingSettings": {
+ "isEnabled": true,
+ "excludedTypes": "Request"
+      }
+    }
+  },
+  "extensions": { "http": { "routePrefix": "" }},
+ "extensionBundle": {
+ "id": "Microsoft.Azure.Functions.ExtensionBundle",
+ "version": "[2.*, 3.0.0)"
+ }
+}
+```
+
+Update the Python code file `__init__.py`, depending on the interface used by your framework. The following examples show an ASGI handler approach using FastAPI and a WSGI wrapper approach using Flask:
+
+# [ASGI](#tab/asgi)
+
+```python
+import logging
+
+import azure.functions as func
+from fastapi import FastAPI
+
+# FastAPI is an ASGI framework; AsgiMiddleware forwards the Functions request to it.
+app = FastAPI()
+
+@app.get("/api/HandleApproach")
+def test():
+    return "Hello!"
+
+def main(req: func.HttpRequest, context) -> func.HttpResponse:
+    logging.info('Python HTTP trigger function processed a request.')
+    return func.AsgiMiddleware(app).handle(req, context)
+```
+
+# [WSGI](#tab/wsgi)
+
+```python
+import logging
+
+import azure.functions as func
+from flask import Flask
+
+# Flask is a WSGI framework; WsgiMiddleware wraps it for the Functions host.
+app = Flask("Test")
+
+@app.route("/api/WrapperApproach")
+def test():
+    return "Hello!"
+
+def main(req: func.HttpRequest, context) -> func.HttpResponse:
+    logging.info('Python HTTP trigger function processed a request.')
+    return func.WsgiMiddleware(app).handle(req, context)
+```
++++ ## Scaling and Performance For scaling and performance best practices for Python function apps, see the [Python scale and performance article](python-scale-performance-reference.md).
An extension that inherits from [FuncExtensionBase](https://github.com/Azure/azu
CORS is fully supported for Python function apps.
+## Async
+
+By default, a host instance for Python can process only one function invocation at a time. This is because Python is a single-threaded runtime. For a function app that processes a large number of I/O events or is being I/O bound, you can significantly improve performance by running functions asynchronously. For more information, see [Improve throughput performance of Python apps in Azure Functions](python-scale-performance-reference.md#async).
+ ## <a name="shared-memory"></a>Shared memory (preview) To improve throughput, Functions lets your out-of-process Python language worker share memory with the Functions host process. When your function app is hitting bottlenecks, you can enable shared memory by adding an application setting named [FUNCTIONS_WORKER_SHARED_MEMORY_DATA_TRANSFER_ENABLED](functions-app-settings.md#functions_worker_shared_memory_data_transfer_enabled) with a value of `1`. With shared memory enabled, you can then use the [DOCKER_SHM_SIZE](functions-app-settings.md#docker_shm_size) setting to set the shared memory to something like `268435456`, which is equivalent to 256 MB.
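For example, a minimal sketch that applies both settings with the Azure CLI (the app and resource group names are placeholders):

```azurecli
az functionapp config appsettings set \
    --name <app-name> \
    --resource-group <resource-group> \
    --settings FUNCTIONS_WORKER_SHARED_MEMORY_DATA_TRANSFER_ENABLED=1 DOCKER_SHM_SIZE=268435456
```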
azure-monitor Alerts Log https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/alerts/alerts-log.md
You can also [create log alert rules using Azure Resource Manager templates](../
:::image type="content" source="media/alerts-log/alerts-rule-details-tab.png" alt-text="Details tab.":::
+> [!NOTE]
+> If you or your administrator assigned the Azure Policy **Azure Log Search Alerts over Log Analytics workspaces should use customer-managed keys**, you must select the **Check workspace linked storage** option in **Advanced options**, or the rule creation will fail because it won't meet the policy requirements.
+ 1. In the **Tags** tab, set any required tags on the alert rule resource. :::image type="content" source="media/alerts-log/alerts-rule-tags-tab.png" alt-text="Tags tab.":::
On success for creation, 201 is returned. On success for update, 200 is returned
* Learn about [log alerts](./alerts-unified-log.md). * Create log alerts using [Azure Resource Manager Templates](./alerts-log-create-templates.md). * Understand [webhook actions for log alerts](./alerts-log-webhook.md).
-* Learn more about [log queries](../logs/log-query-overview.md).
+* Learn more about [log queries](../logs/log-query-overview.md).
azure-monitor Profiler https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/profiler.md
Currently the only regions that require endpoint modifications are [Azure Govern
|ApplicationInsightsProfilerEndpoint | `https://profiler.monitor.azure.us` | `https://profiler.monitor.azure.cn` | |ApplicationInsightsEndpoint | `https://dc.applicationinsights.us` | `https://dc.applicationinsights.azure.cn` |
+## Enable Azure Active Directory authentication for profile ingestion
+
+Application Insights Profiler supports Azure AD authentication for profile ingestion. This means that for all profiles of your application to be ingested, your application must be authenticated and provide the required application settings to the Profiler agent.
+
+As of today, Profiler only supports Azure AD authentication when you reference and configure Azure AD using the Application Insights SDK in your application.
+
+The following steps are required to enable Azure AD for profile ingestion:
+1. Create and add the managed identity you want to use to authenticate against your Application Insights resource to your App Service.
+
+ a. For System-Assigned Managed identity, see the following [documentation](https://docs.microsoft.com/azure/app-service/overview-managed-identity?tabs=portal%2Chttp#add-a-system-assigned-identity)
+
+ b. For User-Assigned Managed identity, see the following [documentation](https://docs.microsoft.com/azure/app-service/overview-managed-identity?tabs=portal%2Chttp#add-a-user-assigned-identity)
+
+2. Configure and enable Azure AD in your Application Insights resource. For more information, see the following [documentation](https://docs.microsoft.com/azure/azure-monitor/app/azure-ad-authentication?tabs=net#configuring-and-enabling-azure-ad-based-authentication)
+3. Add the following application setting, which lets the Profiler agent know which managed identity to use:
+
+For System-Assigned Identity:
+
+|App Setting | Value |
+||-|
+|APPLICATIONINSIGHTS_AUTHENTICATION_STRING | Authentication=AAD |
+
+For User-Assigned Identity:
+
+|App Setting | Value |
+||-|
+|APPLICATIONINSIGHTS_AUTHENTICATION_STRING | Authentication=AAD;ClientId={Client id of the User-Assigned Identity} |
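+For example, you can apply the setting with the Azure CLI; a minimal sketch for the system-assigned case, with placeholder app and resource group names:
+
+```azurecli
+az webapp config appsettings set \
+    --name <app-name> \
+    --resource-group <resource-group> \
+    --settings "APPLICATIONINSIGHTS_AUTHENTICATION_STRING=Authentication=AAD"
+```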
+ ## Disable Profiler To stop or restart Profiler for an individual app's instance, on the left sidebar, select **WebJobs** and stop the webjob named `ApplicationInsightsProfiler3`.
azure-monitor Snapshot Debugger Appservice https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/snapshot-debugger-appservice.md
Currently the only regions that require endpoint modifications are [Azure Govern
For more information about other connection overrides, see [Application Insights documentation](./sdk-connection-string.md?tabs=net#connection-string-with-explicit-endpoint-overrides).
+## Enable Azure Active Directory authentication for snapshot ingestion
+
+Application Insights Snapshot Debugger supports Azure AD authentication for snapshot ingestion. This means that for all snapshots of your application to be ingested, your application must be authenticated and provide the required application settings to the Snapshot Debugger agent.
+
+As of today, Snapshot Debugger only supports Azure AD authentication when you reference and configure Azure AD using the Application Insights SDK in your application.
+
+The following steps are required to enable Azure AD for snapshot ingestion:
+1. Create and add the managed identity you want to use to authenticate against your Application Insights resource to your App Service.
+
+ a. For System-Assigned Managed identity, see the following [documentation](https://docs.microsoft.com/azure/app-service/overview-managed-identity?tabs=portal%2Chttp#add-a-system-assigned-identity)
+
+ b. For User-Assigned Managed identity, see the following [documentation](https://docs.microsoft.com/azure/app-service/overview-managed-identity?tabs=portal%2Chttp#add-a-user-assigned-identity)
+
+2. Configure and enable Azure AD in your Application Insights resource. For more information, see the following [documentation](https://docs.microsoft.com/azure/azure-monitor/app/azure-ad-authentication?tabs=net#configuring-and-enabling-azure-ad-based-authentication)
+3. Add the following application setting, which lets the Snapshot Debugger agent know which managed identity to use:
+
+For System-Assigned Identity:
+
+|App Setting | Value |
+||-|
+|APPLICATIONINSIGHTS_AUTHENTICATION_STRING | Authentication=AAD |
+
+For User-Assigned Identity:
+
+|App Setting | Value |
+||-|
+|APPLICATIONINSIGHTS_AUTHENTICATION_STRING | Authentication=AAD;ClientId={Client id of the User-Assigned Identity} |
+ ## Disable Snapshot Debugger Follow the same steps as for **Enable Snapshot Debugger**, but switch both switches for Snapshot Debugger to **Off**.
azure-monitor Activity Log https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/essentials/activity-log.md
# Azure Activity log
-The Activity log is a [platform log](./platform-logs-overview.md) in Azure that provides insight into subscription-level events. This includes such information as when a resource is modified or when a virtual machine is started. You can view the Activity log in the Azure portal or retrieve entries with PowerShell and CLI. This article provides details on viewing the Activity log and sending it to different destinations.
+The Activity log is a [platform log](./platform-logs-overview.md) in Azure that provides insight into subscription-level events. The Activity log includes such information as when a resource is modified or when a virtual machine is started. You can view the Activity log in the Azure portal or retrieve entries with PowerShell and CLI. This article provides details on viewing the Activity log and sending it to different destinations.
-For additional functionality, you should create a diagnostic setting to send the Activity log to one or more of these locations for the following reasons:
+For more functionality, you should create a diagnostic setting to send the Activity log to one or more of these locations for the following reasons:
- to [Azure Monitor Logs](../logs/data-platform-logs.md) for more complex querying and alerting, and longer retention (up to 2 years) - to Azure Event Hubs to forward outside of Azure - to Azure Storage for cheaper, long-term archiving
See [Create diagnostic settings to send platform logs and metrics to different d
## Retention Period
-Activity log events are retained in Azure for **90 days** and then deleted. There is no charge for entries during this time regardless of volume. For additional functionality such as longer retention, you should create a diagnostic setting and route the entires to another location based on your needs. See the criteria in the earlier section of this article.
+Activity log events are retained in Azure for **90 days** and then deleted. There is no charge for entries during this time regardless of volume. For more functionality such as longer retention, you should create a diagnostic setting and route the entries to another location based on your needs. See the criteria in the earlier section of this article.
## View the Activity log
-You can access the Activity log from most menus in the Azure portal. The menu that you open it from determines its initial filter. If you open it from the **Monitor** menu, then the only filter will be on the subscription. If you open it from a resource's menu, then the filter will be set to that resource. You can always change the filter though to view all other entries. Click **Add Filter** to add additional properties to the filter.
+You can access the Activity log from most menus in the Azure portal. The menu that you open it from determines its initial filter. If you open it from the **Monitor** menu, then the only filter will be on the subscription. If you open it from a resource's menu, then the filter is set to that resource. You can always change the filter though to view all other entries. Select **Add Filter** to add more properties to the filter.
![View Activity Log](./media/activity-log/view-activity-log.png)
For some events, you can view the Change history, which shows what changes happe
![Change history list for an event](media/activity-log/change-history-event.png)
-If there are any associated changes with the event, you'll see a list of changes that you can select. This opens up the **Change history (Preview)** page. On this page, you see the changes to the resource. In the following example, you can see not only that the VM changed sizes, but what the previous VM size was before the change and what it was changed to. To learn more about change history, see [Get resource changes](../../governance/resource-graph/how-to/get-resource-changes.md).
+If there are any changes associated with the event, you will see a list of changes that you can select. This opens up the **Change history (Preview)** page. On this page, you see the changes to the resource. In the following example, you can see not only that the VM changed sizes, but what the previous VM size was before the change and what it was changed to. To learn more about change history, see [Get resource changes](../../governance/resource-graph/how-to/get-resource-changes.md).
![Change history page showing differences](media/activity-log/change-history-event-details.png) ### Other methods to retrieve Activity log events
-You can also access Activity log events using the following methods.
+You can also access Activity log events using the following methods:
- Use the [Get-AzLog](/powershell/module/az.monitor/get-azlog) cmdlet to retrieve the Activity Log from PowerShell. See [Azure Monitor PowerShell samples](../powershell-samples.md#retrieve-activity-log). - Use [az monitor activity-log](/cli/azure/monitor/activity-log) to retrieve the Activity Log from CLI. See [Azure Monitor CLI samples](../cli-samples.md#view-activity-log).
You can also access Activity log events using the following methods.
- No data ingestion charges for Activity log data stored in a Log Analytics workspace. - No data retention charges for the first 90 days for Activity log data stored in a Log Analytics workspace.
+ Select **Export Activity Logs**.
-[Create a diagnostic setting](./diagnostic-settings.md) to send the Activity log to a Log Analytics workspace. You can send the Activity log from any single subscription to up to five workspaces.
+ ![Export activity logs](media/activity-log/diagnostic-settings-export.png)
+
+Use this option to send the Activity log to a Log Analytics workspace. You can send the Activity log from any single subscription to up to five workspaces.
Activity log data in a Log Analytics workspace is stored in a table called *AzureActivity* that you can retrieve with a [log query](../logs/log-query-overview.md) in [Log Analytics](../logs/log-analytics-tutorial.md). The structure of this table varies depending on the [category of the log entry](activity-log-schema.md). For a description of the table properties, see the [Azure Monitor data reference](/azure/azure-monitor/reference/tables/azureactivity).
-For example, to view a count of Activity log records for each category, use the following query.
+For example, to view a count of Activity log records for each category, use the following query:
```kusto AzureActivity | summarize count() by CategoryValue ```
-To retrieve all records in the administrative category, use the following query.
+To retrieve all records in the administrative category, use the following query:
```kusto AzureActivity
Following is sample output data from Event Hubs for an Activity log:
``` ## Send to Azure storage
-Send the Activity Log to an Azure Storage Account if you want to retain your log data longer than 90 days for audit, static analysis, or backup. If you only need to retain your events for 90 days or less you do not need to set up archival to a Storage Account, since Activity Log events are retained in the Azure platform for 90 days.
+Send the Activity Log to an Azure Storage Account if you want to retain your log data longer than 90 days for audit, static analysis, or backup. If you need to retain your events for only 90 days or less, you do not need to set up archival to a Storage Account, since Activity Log events are retained in the Azure platform for 90 days.
When you send the Activity log to Azure, a storage container is created in the Storage Account as soon as an event occurs. The blobs in the container use the following naming convention:
This section describes legacy methods for collecting the Activity log that were
Log profiles are the legacy method for sending the Activity log to Azure storage or Event Hubs. Use the following procedure to continue working with a log profile or to disable it in preparation for migrating to a diagnostic setting. 1. From the **Azure Monitor** menu in the Azure portal, select **Activity log**.
-3. Click **Diagnostic settings**.
+3. Select **Export Activity Logs**.
- ![Diagnostic settings](media/activity-log/diagnostic-settings.png)
+ ![Export activity logs](media/activity-log/diagnostic-settings-export.png)
-4. Click the purple banner for the legacy experience.
+4. Select the purple banner for the legacy experience.
![Legacy experience](media/activity-log/legacy-experience.png) ### Configure log profile using PowerShell
-If a log profile already exists, you first need to remove the existing log profile and then create a new one.
+If a log profile already exists, you first must remove the existing log profile and then create a new one.
1. Use `Get-AzLogProfile` to identify if a log profile exists. If a log profile does exist, note the *name* property.
If a log profile already exists, you first need to remove the existing log profi
| StorageAccountId |No |Resource ID of the Storage Account where the Activity Log should be saved. | | serviceBusRuleId |No |Service Bus Rule ID for the Service Bus namespace you would like to have Event Hubs created in. This is a string with the format: `{service bus resource ID}/authorizationrules/{key name}`. | | Location |Yes |Comma-separated list of regions for which you would like to collect Activity Log events. |
- | RetentionInDays |Yes |Number of days for which events should be retained in the Storage Account, between 1 and 365. A value of zero stores the logs indefinitely. |
+ | RetentionInDays |Yes |Number of days for which events should be retained in the Storage Account, from 1 through 365. A value of zero stores the logs indefinitely. |
| Category |No |Comma-separated list of event categories that should be collected. Possible values are _Write_, _Delete_, and _Action_. | ### Example script
Following is a sample PowerShell script to create a log profile that writes the
### Configure log profile using Azure CLI
-If a log profile already exists, you first need to remove the existing log profile and then create a new log profile.
+If a log profile already exists, you first must remove the existing log profile and then create a new one.
1. Use `az monitor log-profiles list` to identify if a log profile exists. 2. Use `az monitor log-profiles delete --name "<log profile name>` to remove the log profile using the value from the *name* property.
-3. Use `az monitor log-profiles create` to create a new log profile:
+3. Use `az monitor log-profiles create` to create a log profile:
```azurecli-interactive az monitor log-profiles create --name "default" --location null --locations "global" "eastus" "westus" --categories "Delete" "Write" "Action" --enabled false --days 0 --service-bus-rule-id "/subscriptions/<YOUR SUBSCRIPTION ID>/resourceGroups/<RESOURCE GROUP NAME>/providers/Microsoft.EventHub/namespaces/<Event Hub NAME SPACE>/authorizationrules/RootManageSharedAccessKey"
If a log profile already exists, you first need to remove the existing log profi
| name |Yes |Name of your log profile. | | storage-account-id |Yes |Resource ID of the Storage Account to which Activity Logs should be saved. | | locations |Yes |Space-separated list of regions for which you would like to collect Activity Log events. You can view a list of all regions for your subscription using `az account list-locations --query [].name`. |
- | days |Yes |Number of days for which events should be retained, between 1 and 365. A value of zero will store the logs indefinitely (forever). If zero, then the enabled parameter should be set to false. |
+ | days |Yes |Number of days for which events should be retained, from 1 through 365. A value of zero will store the logs indefinitely (forever). If zero, then the enabled parameter should be set to false. |
|enabled | Yes |True or False. Used to enable or disable the retention policy. If True, then the days parameter must be a value greater than 0. | categories |Yes |Space-separated list of event categories that should be collected. Possible values are Write, Delete, and Action. | ### Log Analytics workspace
-The legacy method for sending the Activity log into a Log Analytics workspace is connecting the log in the workspace configuration.
+The legacy method for sending the Activity log into a Log Analytics workspace is connecting the log in the workspace configuration.
1. From the **Log Analytics workspaces** menu in the Azure portal, select the workspace to collect the Activity Log. 1. In the **Workspace Data Sources** section of the workspace's menu, select **Azure Activity log**.
-1. Click the subscription you want to connect.
+1. Select the subscription that you want to connect.
![Screenshot shows Log Analytics workspace with an Azure Activity log selected.](media/activity-log/workspaces.png)
-2. Click **Connect** to connect the Activity log in the subscription to the selected workspace. If the subscription is already connected to another workspace, click **Disconnect** first to disconnect it.
+2. Select **Connect** to connect the Activity log in the subscription to the selected workspace. If the subscription is already connected to another workspace, select **Disconnect** first to disconnect it.
![Connect Workspaces](media/activity-log/connect-workspace.png)
-To disable the setting, perform the same procedure and click **Disconnect** to remove the subscription from the workspace.
+To disable the setting, perform the same procedure and select **Disconnect** to remove the subscription from the workspace.
### Data structure changes
-Diagnostic settings send the same data as the legacy method used to send the Activity log with some changes to the structure of the *AzureActivity* table.
+The Export activity logs experience sends the same data as the legacy method used to send the Activity log, with some changes to the structure of the *AzureActivity* table.
-The columns in the following table have been deprecated in the updated schema. They still exist in *AzureActivity* but they will have no data. The replacements for these columns are not new, but they contain the same data as the deprecated column. They are in a different format, so you may need to modify log queries that use them.
+The columns in the following table have been deprecated in the updated schema. They still exist in *AzureActivity* but they have no data. The replacements for these columns are not new, but they contain the same data as the deprecated column. They are in a different format, so you might need to modify log queries that use them.
|Activity Log JSON | Log Analytics column name<br/>*(older deprecated)* | New Log Analytics column name | Notes | |:|:|:|:|
The columns in the following table have been deprecated in the updated schema. T
> [!Important] > In some cases, the values in these columns may be in all uppercase. If you have a query that includes these columns, you should use the [=~ operator](/azure/kusto/query/datatypes-string-operators) to do a case insensitive comparison.
-The following column have been added to *AzureActivity* in the updated schema:
+The following columns have been added to *AzureActivity* in the updated schema:
- Authorization_d - Claims_d
Monitoring solutions are accessed from the **Monitor** menu in the Azure portal.
![Azure Activity Logs tile](media/activity-log/azure-activity-logs-tile.png)
-Click the **Azure Activity Logs** tile to open the **Azure Activity Logs** view. The view includes the visualization parts in the following table. Each part lists up to 10 items matching that part's criteria for the specified time range. You can run a log query that returns all matching records by clicking **See all** at the bottom of the part.
+Select the **Azure Activity Logs** tile to open the **Azure Activity Logs** view. The view includes the visualization parts in the following table. Each part lists up to 10 items that match that part's criteria for the specified time range. You can run a log query that returns all matching records by selecting **See all** at the bottom of the part.
![Azure Activity Logs dashboard](media/activity-log/activity-log-dash.png) ### Enable the solution for new subscriptions > [!NOTE]
->You will soon no longer be able to add the Activity Logs Analytics solution to your subscription using the Azure portal. You can add it using the following procedure with a Resource Manager template.
+>You will soon no longer be able to add the Activity Logs Analytics solution to your subscription with the Azure portal. You can add it using the following procedure with a Resource Manager template.
1. Copy the following json into a file called *ActivityLogTemplate*.json.
Click the **Azure Activity Logs** tile to open the **Azure Activity Logs** view.
## Next steps- * [Read an overview of platform logs](./platform-logs-overview.md) * [Review Activity log event schema](activity-log-schema.md)
-* [Create diagnostic setting to send Activity logs to other destinations](./diagnostic-settings.md)
+* [Create diagnostic setting to send Activity logs to other destinations](./diagnostic-settings.md)
azure-monitor Alert Management Solution https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/insights/alert-management-solution.md
Title: Alert Management solution in Azure Log Analytics | Microsoft Docs description: The Alert Management solution in Log Analytics helps you analyze all of the alerts in your environment. In addition to consolidating alerts generated within Log Analytics, it imports alerts from connected System Center Operations Manager management groups into Log Analytics. -- Previously updated : 01/19/2018++ Last updated : 01/02/2022
Last updated 01/19/2018
![Alert Management icon](media/alert-management-solution/icon.png)
+> [!CAUTION]
+> This solution is no longer in active development and may not work as expected. We suggest you try using [Azure Resource Graph to query Azure Monitor alerts](../alerts/alerts-overview.md#manage-your-alert-instances-programmatically).
+ The Alert Management solution helps you analyze all of the alerts in your Log Analytics repository. These alerts may have come from a variety of sources including those sources [created by Log Analytics](../alerts/alerts-overview.md) or [imported from Nagios or Zabbix](../vm/monitor-virtual-machine.md). The solution also imports alerts from any [connected System Center Operations Manager management groups](../agents/om-agents.md). ## Prerequisites
The following table provides sample log searches for alert records collected by
## Next steps
-* Learn about [Alerts in Log Analytics](../alerts/alerts-overview.md) for details on generating alerts from Log Analytics.
+* Learn about [Alerts in Log Analytics](../alerts/alerts-overview.md) for details on generating alerts from Log Analytics.
azure-monitor Monitor Reference https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/monitor-reference.md
description: Reference of all services and other resources monitored by Azure Mo
Previously updated : 11/02/2021 Last updated : 11/10/2021
The table below lists the available curated visualizations and more detailed inf
|Name with docs link| State | [Azure portal Link](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/more)| Description | |:--|:--|:--|:--|
-| [Azure Monitor Workbooks for Azure Active Directory](../active-directory/reports-monitoring/howto-use-azure-monitor-workbooks.md) | GA (General availability) | [Yes](https://ms.portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Workbooks) | Azure Active Directory provides workbooks to understand the effect of your Conditional Access policies, to troubleshoot sign-in failures, and to identify legacy authentications. |
-| [Azure Backup Insights](../backup/backup-azure-monitoring-use-azuremonitor.md) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_DataProtection/BackupCenterMenuBlade/backupReportsConfigure/menuId/backupReportsConfigure) | Provides built-in monitoring and alerting capabilities in a Recovery Services vault. |
-| [Azure Monitor for Azure Cache for Redis (preview)](./insights/redis-cache-insights-overview.md) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/redisCacheInsights) | Provides a unified, interactive view of overall performance, failures, capacity, and operational health |
-| [Azure Cosmos DB Insights](./insights/cosmosdb-insights-overview.md) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/cosmosDBInsights) | Provides a view of the overall performance, failures, capacity, and operational health of all your Azure Cosmos DB resources in a unified interactive experience. |
-| [Azure Data Explorer insights](./insights/data-explorer.md) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/adxClusterInsights) | Azure Data Explorer Insights provides comprehensive monitoring of your clusters by delivering a unified view of your cluster performance, operations, usage, and failures. |
-| [Azure HDInsight (preview)](../hdinsight/log-analytics-migration.md#insights) | Preview | No | An Azure Monitor workbook that collects important performance metrics from your HDInsight cluster and provides the visualizations and dashboards for most common scenarios. Gives a complete view of a single HDInsight cluster including resource utilization and application status|
- | [Azure IoT Edge Insights](../iot-edge/how-to-explore-curated-visualizations.md) | GA | No | Visualize and explore metrics collected from the IoT Edge device right in the Azure portal using Azure Monitor Workbooks based public templates. The curated workbooks use built-in metrics from the IoT Edge runtime. These views don't need any metrics instrumentation from the workload modules. |
- | [Azure Key Vault Insights (preview)](./insights/key-vault-insights-overview.md) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/keyvaultsInsights) | Provides comprehensive monitoring of your key vaults by delivering a unified view of your Key Vault requests, performance, failures, and latency. |
- | [Azure Monitor Application Insights](./app/app-insights-overview.md) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/applicationsInsights) | Extensible Application Performance Management (APM) service which monitors the availability, performance, and usage of your web applications whether they're hosted in the cloud or on-premises. It leverages the powerful data analysis platform in Azure Monitor to provide you with deep insights into your application's operations. It enables you to diagnose errors without waiting for a user to report them. Application Insights includes connection points to a variety of development tools and integrates with Visual Studio to support your DevOps processes. |
- | [Azure Monitor Log Analytics Workspace](./logs/log-analytics-workspace-insights-overview.md) | Preview | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/lawsInsights) | Log Analytics Workspace Insights (preview) provides comprehensive monitoring of your workspaces through a unified view of your workspace usage, performance, health, agent, queries, and change log. This article will help you understand how to onboard and use Log Analytics Workspace Insights (preview). |
- | [Azure Service Bus](/azure/service-bus/) | Preview | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/serviceBusInsights) | |
- | [Azure SQL insights](./insights/sql-insights-overview.md) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/sqlWorkloadInsights) | A comprehensive interface for monitoring any product in the Azure SQL family. SQL insights uses dynamic management views to expose the data you need to monitor health, diagnose problems, and tune performance. Note: If you are just setting up SQL monitoring, use this instead of the SQL Analytics solution. |
+| [Azure Monitor Workbooks for Azure Active Directory](/azure/active-directory/reports-monitoring/howto-use-azure-monitor-workbooks) | GA (General availability) | [Yes](https://ms.portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Workbooks) | Azure Active Directory provides workbooks to understand the effect of your Conditional Access policies, to troubleshoot sign-in failures, and to identify legacy authentications. |
+| [Azure Backup](/azure/backup/backup-azure-monitoring-use-azuremonitor) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_DataProtection/BackupCenterMenuBlade/backupReportsConfigure/menuId/backupReportsConfigure) | Provides built-in monitoring and alerting capabilities in a Recovery Services vault. |
+| [Azure Monitor for Azure Cache for Redis (preview)](/azure/azure-monitor/insights/redis-cache-insights-overview) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/redisCacheInsights) | Provides a unified, interactive view of overall performance, failures, capacity, and operational health |
+| [Azure Cosmos DB Insights](/azure/azure-monitor/insights/cosmosdb-insights-overview) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/cosmosDBInsights) | Provides a view of the overall performance, failures, capacity, and operational health of all your Azure Cosmos DB resources in a unified interactive experience. |
+| [Azure Container Insights](/azure/azure-monitor/insights/container-insights-overview) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/containerInsights) | Monitors the performance of container workloads that are deployed to managed Kubernetes clusters hosted on Azure Kubernetes Service (AKS). It gives you performance visibility by collecting metrics from controllers, nodes, and containers that are available in Kubernetes through the Metrics API. Container logs are also collected. After you enable monitoring from Kubernetes clusters, these metrics and logs are automatically collected for you through a containerized version of the Log Analytics agent for Linux. |
+| [Azure Data Explorer insights](/azure/azure-monitor/insights/data-explorer) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/adxClusterInsights) | Azure Data Explorer Insights provides comprehensive monitoring of your clusters by delivering a unified view of your cluster performance, operations, usage, and failures. |
+| [Azure HDInsight (preview)](/azure/hdinsight/log-analytics-migration#insights) | Preview | No | An Azure Monitor workbook that collects important performance metrics from your HDInsight cluster and provides the visualizations and dashboards for most common scenarios. Gives a complete view of a single HDInsight cluster including resource utilization and application status|
 | [Azure IoT Edge](/azure/iot-edge/how-to-explore-curated-visualizations/) | GA | No | Visualize and explore metrics collected from the IoT Edge device right in the Azure portal by using Azure Monitor Workbooks-based public templates. The curated workbooks use built-in metrics from the IoT Edge runtime. These views don't need any metrics instrumentation from the workload modules. |
+ | [Azure Key Vault Insights (preview)](/azure/azure-monitor/insights/key-vault-insights-overview) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/keyvaultsInsights) | Provides comprehensive monitoring of your key vaults by delivering a unified view of your Key Vault requests, performance, failures, and latency. |
+ | [Azure Monitor Application Insights](/azure/azure-monitor/app/app-insights-overview) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/applicationsInsights) | Extensible Application Performance Management (APM) service which monitors the availability, performance, and usage of your web applications whether they're hosted in the cloud or on-premises. It leverages the powerful data analysis platform in Azure Monitor to provide you with deep insights into your application's operations. It enables you to diagnose errors without waiting for a user to report them. Application Insights includes connection points to a variety of development tools and integrates with Visual Studio to support your DevOps processes. |
+ | [Azure Monitor Log Analytics Workspace](/azure/azure-monitor/logs/log-analytics-workspace-insights-overview) | Preview | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/lawsInsights) | Log Analytics Workspace Insights (preview) provides comprehensive monitoring of your workspaces through a unified view of your workspace usage, performance, health, agent, queries, and change log. This article will help you understand how to onboard and use Log Analytics Workspace Insights (preview). |
+ | [Azure Service Bus Insights](/azure/service-bus-messaging/monitor-service-bus) | Preview | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/serviceBusInsights) | |
+ | [Azure SQL insights](/azure/azure-monitor/insights/sql-insights-overview) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/sqlWorkloadInsights) | A comprehensive interface for monitoring any product in the Azure SQL family. SQL insights uses dynamic management views to expose the data you need to monitor health, diagnose problems, and tune performance. Note: If you are just setting up SQL monitoring, use this instead of the SQL Analytics solution. |
| [Azure Storage Insights](/azure/azure-monitor/insights/storage-insights-overview) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/storageInsights) | Provides comprehensive monitoring of your Azure Storage accounts by delivering a unified view of your Azure Storage services performance, capacity, and availability. |
- | [Azure VM Insights](/azure/azure-monitor/insights/vminsights-overview) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/virtualMachines) | Monitors your Azure virtual machines (VM) and virtual machine scale sets at scale. It analyzes the performance and health of your Windows and Linux VMs, and monitors their processes and dependencies on other resources and external processes. |
- | [Azure Network Insights](./insights/network-insights-overview.md) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/networkInsights) | Provides a comprehensive view of health and metrics for all your network resource. The advanced search capability helps you identify resource dependencies, enabling scenarios like identifying resource that are hosting your website, by simply searching for your website name. |
- | [Azure Container Insights](/azure/azure-monitor/insights/container-insights-overview) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/containerInsights) | Monitors the performance of container workloads that are deployed to managed Kubernetes clusters hosted on Azure Kubernetes Service (AKS). It gives you performance visibility by collecting metrics from controllers, nodes, and containers that are available in Kubernetes through the Metrics API. Container logs are also collected. After you enable monitoring from Kubernetes clusters, these metrics and logs are automatically collected for you through a containerized version of the Log Analytics agent for Linux. |
- | [Azure Monitor for Resource Groups](./insights/resource-group-insights.md) | GA | No | Triage and diagnose any problems your individual resources encounter, while offering context as to the health and performance of the resource group as a whole. |
- | [Azure Monitor SAP](../virtual-machines/workloads/sap/monitor-sap-on-azure.md) | GA | No | An Azure-native monitoring product for anyone running their SAP landscapes on Azure. It works with both SAP on Azure Virtual Machines and SAP on Azure Large Instances. Collects telemetry data from Azure infrastructure and databases in one central location and visually correlate the data for faster troubleshooting. You can monitor different components of an SAP landscape, such as Azure virtual machines (VMs), high-availability cluster, SAP HANA database, SAP NetWeaver, and so on, by adding the corresponding provider for that component. |
+ | [Azure Network Insights](/azure/azure-monitor/insights/network-insights-overview) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/networkInsights) | Provides a comprehensive view of health and metrics for all your network resources. The advanced search capability helps you identify resource dependencies, enabling scenarios like identifying the resources that host your website, simply by searching for your website name. |
+ | [Azure Monitor for Resource Groups](/azure/azure-monitor/insights/resource-group-insights) | GA | No | Triage and diagnose any problems your individual resources encounter, while offering context as to the health and performance of the resource group as a whole. |
+ | [Azure Monitor SAP](/azure/virtual-machines/workloads/sap/monitor-sap-on-azure) | GA | No | An Azure-native monitoring product for anyone running their SAP landscapes on Azure. It works with both SAP on Azure Virtual Machines and SAP on Azure Large Instances. Collects telemetry data from Azure infrastructure and databases in one central location and visually correlate the data for faster troubleshooting. You can monitor different components of an SAP landscape, such as Azure virtual machines (VMs), high-availability cluster, SAP HANA database, SAP NetWeaver, and so on, by adding the corresponding provider for that component. |
| [Azure Stack HCI insights](/azure-stack/hci/manage/azure-stack-hci-insights) | Preview | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/azureStackHCIInsights) | Azure Monitor Workbook based. Provides health, performance, and usage insights about registered Azure Stack HCI, version 21H2 clusters that are connected to Azure and are enrolled in monitoring. It stores its data in a Log Analytics workspace, which allows it to deliver powerful aggregation and filtering and analyze data trends over time. |
- | [Windows Virtual Desktop Insights](../virtual-desktop/azure-monitor.md) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_WVD/WvdManagerMenuBlade/insights/menuId/insights) | Azure Monitor for Windows Virtual Desktop (preview) is a dashboard built on Azure Monitor Workbooks that helps IT professionals understand their Windows Virtual Desktop environments. This topic will walk you through how to set up Azure Monitor for Windows Virtual Desktop to monitor your Windows Virtual Desktop environments. |
+ | [Azure VM Insights](/azure/azure-monitor/insights/vminsights-overview) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/virtualMachines) | Monitors your Azure virtual machines (VM) and virtual machine scale sets at scale. It analyzes the performance and health of your Windows and Linux VMs, and monitors their processes and dependencies on other resources and external processes. |
+ | [Azure Virtual Desktop Insights](/azure/virtual-desktop/azure-monitor) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_WVD/WvdManagerMenuBlade/insights/menuId/insights) | Azure Virtual Desktop Insights is a dashboard built on Azure Monitor Workbooks that helps IT professionals understand their Azure Virtual Desktop environments. |
## Product integrations
azure-monitor View Designer Filters https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/visualize/view-designer-filters.md
Title: Filters in Azure Monitor views | Microsoft Docs description: A filter in an Azure Monitor view allows users to filter the data in the view by the value of a particular property without modifying the view itself. This article describes how to use a filter and add one to a custom view. -- Last updated 06/22/2018 # Filters in Azure Monitor views+
+> [!IMPORTANT]
+> Views in Azure Monitor [are being retired](https://azure.microsoft.com/updates/view-designer-in-azure-monitor-is-retiring-on-31-august-2023/) and transitioned to [workbooks](workbooks-overview.md), which provide additional functionality. See [Azure Monitor view designer to workbooks transition guide](view-designer-conversion-overview.md) for details on converting your existing views to workbooks.
+ A **filter** in an [Azure Monitor view](view-designer.md) allows users to filter the data in the view by the value of a particular property without modifying the view itself. For example, you could allow users of your view to filter the view for data only from a particular computer or set of computers. You can create multiple filters on a single view to allow users to filter by multiple properties. This article describes how to use a filter and add one to a custom view.

## Using a filter
azure-monitor View Designer Parts https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/visualize/view-designer-parts.md
Title: A reference guide to the View Designer parts in Azure Monitor | Microsoft Docs description: By using View Designer in Azure Monitor, you can create custom views that are displayed in the Azure portal and contain a variety of visualizations on data in the Log Analytics workspace. This article is a reference guide to the settings for the visualization parts that are available in your custom views. -- Last updated 03/12/2018 # Reference guide to View Designer visualization parts in Azure Monitor+
+> [!IMPORTANT]
+> Views in Azure Monitor [are being retired](https://azure.microsoft.com/updates/view-designer-in-azure-monitor-is-retiring-on-31-august-2023/) and transitioned to [workbooks](workbooks-overview.md), which provide additional functionality. See [Azure Monitor view designer to workbooks transition guide](view-designer-conversion-overview.md) for details on converting your existing views to workbooks.
+ By using View Designer in Azure Monitor, you can create a variety of custom views in the Azure portal that can help you visualize data in your Log Analytics workspace. This article is a reference guide to the settings for the visualization parts that are available in your custom views. For more information about View Designer, see:
The following table describes the settings for thresholds:
| Color | The color that indicates the threshold value. |

## Next steps
-* Learn about [log queries](../logs/log-query-overview.md) to support the queries in visualization parts.
+* Learn about [log queries](../logs/log-query-overview.md) to support the queries in visualization parts.
azure-monitor View Designer Tiles https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/visualize/view-designer-tiles.md
Title: A reference guide to the View Designer tiles in Azure Monitor | Microsoft Docs description: By using View Designer in Azure Monitor, you can create custom views that are displayed in the Azure portal and contain a variety of visualizations on data in the Log Analytics workspace. This article is a reference guide to the settings for the tiles that are available in your custom views. -- Last updated 01/17/2018 - # Reference guide to View Designer tiles in Azure Monitor+
+> [!IMPORTANT]
+> Views in Azure Monitor [are being retired](https://azure.microsoft.com/updates/view-designer-in-azure-monitor-is-retiring-on-31-august-2023/) and transitioned to [workbooks](workbooks-overview.md), which provide additional functionality. See [Azure Monitor view designer to workbooks transition guide](view-designer-conversion-overview.md) for details on converting your existing views to workbooks.
+ By using View Designer in Azure Monitor, you can create a variety of custom views in the Azure portal that can help you visualize data in your Log Analytics workspace. This article is a reference guide to the settings for the tiles that are available in your custom views. For more information about View Designer, see:
The **Two timelines** tile displays the results of two log queries over time as
## Next steps

* Learn about [log queries](../logs/log-query-overview.md) to support the queries in tiles.
-* Add [visualization parts](view-designer-parts.md) to your custom view.
+* Add [visualization parts](view-designer-parts.md) to your custom view.
azure-monitor View Designer https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/visualize/view-designer.md
Title: Create views to analyze log data in Azure Monitor | Microsoft Docs description: By using View Designer in Azure Monitor, you can create custom views that are displayed in the Azure portal and contain a variety of visualizations on data in the Log Analytics workspace. This article contains an overview of View Designer and presents procedures for creating and editing custom views. --++ Last updated 08/04/2020
Last updated 08/04/2020
By using View Designer in Azure Monitor, you can create a variety of custom views in the Azure portal that can help you visualize data in your Log Analytics workspace. This article presents an overview of View Designer and procedures for creating and editing custom views. > [!IMPORTANT]
-> Views in Azure Monitor have been transitioned to [workbooks](workbooks-overview.md) which provide additional functionality. See [Azure Monitor view designer to workbooks transition guide](view-designer-conversion-overview.md) for details on converting your existing views to workbooks.
+> Views in Azure Monitor [are being retired](https://azure.microsoft.com/updates/view-designer-in-azure-monitor-is-retiring-on-31-august-2023/) and transitioned to [workbooks](workbooks-overview.md), which provide additional functionality. See [Azure Monitor view designer to workbooks transition guide](view-designer-conversion-overview.md) for details on converting your existing views to workbooks.
The options for working with views in edit mode are described in the following t
## Next steps

* Add [Tiles](view-designer-tiles.md) to your custom view.
-* Add [Visualization parts](view-designer-parts.md) to your custom view.
+* Add [Visualization parts](view-designer-parts.md) to your custom view.
azure-portal How To Create Azure Support Request https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-portal/supportability/how-to-create-azure-support-request.md
Title: How to create an Azure support request
description: Customers who need assistance can use the Azure portal to find self-service solutions and to create and manage support requests. Previously updated : 12/07/2021 Last updated : 02/01/2022 # Create an Azure support request
You can get to **Help + support** in the Azure portal. It's available from the A
### Azure role-based access control
-To create a support request, you must be an [Owner](../../role-based-access-control/built-in-roles.md#owner), [Contributor](../../role-based-access-control/built-in-roles.md#contributor) or be assigned to the [Support Request Contributor](../../role-based-access-control/built-in-roles.md#support-request-contributor) role at the subscription level. To create a support request without a subscription, for example an Azure Active Directory scenario, you must be an [Admin](../../active-directory/roles/permissions-reference.md).
+To create a support request, you must have the [Owner](../../role-based-access-control/built-in-roles.md#owner), [Contributor](../../role-based-access-control/built-in-roles.md#contributor), or [Support Request Contributor](../../role-based-access-control/built-in-roles.md#support-request-contributor) role, or a custom role with [Microsoft.Support/*](/azure/role-based-access-control/resource-provider-operations#microsoftsupport) permissions, at the subscription level.
+
+To create a support request without a subscription, for example an Azure Active Directory scenario, you must be an [Admin](../../active-directory/roles/permissions-reference.md).
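+
+For example, a subscription Owner can grant a user the Support Request Contributor role with a command along these lines (a sketch; the assignee and subscription ID are placeholders):
+
+```cmd
+REM Placeholder assignee and subscription ID
+az role assignment create --assignee "user@contoso.com" --role "Support Request Contributor" --scope "/subscriptions/<subscription-id>"
+```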
> [!IMPORTANT]
-> If a support request requires investigation into multiple subscriptions, you must have Owner, Contributor, or Support Request Contributor role for each subscription involved.
+> If a support request requires investigation into multiple subscriptions, you must have the required access for each subscription involved ([Owner](../../role-based-access-control/built-in-roles.md#owner), [Contributor](../../role-based-access-control/built-in-roles.md#contributor), [Reader](../../role-based-access-control/built-in-roles.md#reader), [Support Request Contributor](../../role-based-access-control/built-in-roles.md#support-request-contributor), or a custom role with the [Microsoft.Support/supportTickets/read](/azure/role-based-access-control/resource-provider-operations#microsoftsupport) permission).
### Go to Help + support from the global header
azure-resource-manager Modules https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/modules.md
Title: Bicep modules description: Describes how to define a module in a Bicep file, and how to use module scopes. Previously updated : 11/19/2021 Last updated : 02/01/2022 # Bicep modules
Bicep enables you to organize deployments into modules. A module is just a Bicep
To share modules with other people in your organization, create a [template spec](../templates/template-specs.md) or [private registry](private-module-registry.md). Template specs and modules in the registry are only available to users with the correct permissions. > [!TIP]
-> The choice between template specs and private registries is mostly a matter of preference. If you're deploying templates or Bicep files without other project artifacts, template specs are an easier option. If you're deploying project artifacts with the templates or Bicep files, you can integrate the private registry with your development work and then more easily deploy all of it from the registry.
+> The choice between module registry and template specs is mostly a matter of preference. There are a few things to consider when you choose between the two:
+>
+> - Module registry is only supported by Bicep. If you are not yet using Bicep, use template specs.
+> - Content in the Bicep module registry can only be deployed from another Bicep file. Template specs can be deployed directly from the API, Azure PowerShell, Azure CLI, and the Azure portal. You can even use [`UiFormDefinition`](../templates/template-specs-create-portal-forms.md) to customize the portal deployment experience.
+> - Bicep has some limited capabilities for embedding other project artifacts (including non-Bicep and non-ARM-template files, such as PowerShell scripts, CLI scripts, and other binaries) by using the [`loadTextContent`](./bicep-functions-files.md#loadtextcontent) and [`loadFileAsBase64`](./bicep-functions-files.md#loadfileasbase64) functions. Template specs can't package these artifacts.
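+
+As a minimal sketch of the module registry side of this comparison, a Bicep file can be published to a private registry with the Azure CLI. The registry name and module path are illustrative, and a recent CLI version with Bicep support is assumed:
+
+```cmd
+REM Publishes a local Bicep file as a module (registry and path are examples)
+az bicep publish --file storage.bicep --target "br:exampleregistry.azurecr.io/bicep/modules/storage:v1"
+```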
Bicep modules are converted into a single Azure Resource Manager template with [nested templates](../templates/linked-templates.md#nested-template).
azure-resource-manager Template Specs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/template-specs.md
Title: Create & deploy template specs in Bicep description: Describes how to create template specs in Bicep and share them with other users in your organization. Previously updated : 01/19/2022 Last updated : 02/01/2022 # Azure Resource Manager template specs in Bicep
To deploy the template spec, you use standard Azure tools like PowerShell, Azure
When designing your deployment, always consider the lifecycle of the resources, and group the resources that share a similar lifecycle into a single template spec. For instance, suppose your deployments include multiple instances of Cosmos DB, with each instance containing its own databases and containers. Given that the databases and the containers don't change much, you want to create one template spec to include a Cosmos DB instance and its underlying databases and containers. You can then use conditional statements in your Bicep along with copy loops to create multiple instances of these resources.
-The choice between template specs and [private module registries](./private-module-registry.md) is mostly a matter of preference. If you're deploying templates or Bicep files without other project artifacts, template specs are an easier option. If you're deploying project artifacts with the templates or Bicep files, you can integrate the private registry with your development work and then more easily deploy all of it from the registry.
+> [!TIP]
+> The choice between module registry and template specs is mostly a matter of preference. There are a few things to consider when you choose between the two:
+>
+> - Module registry is only supported by Bicep. If you are not yet using Bicep, use template specs.
+> - Content in the Bicep module registry can only be deployed from another Bicep file. Template specs can be deployed directly from the API, Azure PowerShell, Azure CLI, and the Azure portal. You can even use [`UiFormDefinition`](../templates/template-specs-create-portal-forms.md) to customize the portal deployment experience.
+> - Bicep has some limited capabilities for embedding other project artifacts (including non-Bicep and non-ARM-template files, such as PowerShell scripts, CLI scripts, and other binaries) by using the [`loadTextContent`](./bicep-functions-files.md#loadtextcontent) and [`loadFileAsBase64`](./bicep-functions-files.md#loadfileasbase64) functions. Template specs can't package these artifacts.
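+
+As a minimal sketch of the template spec side of this comparison, a template spec can be created with the Azure CLI. The names, location, and file here are illustrative; recent CLI versions can package a Bicep file directly:
+
+```cmd
+REM Creates a template spec from a local file (all values are examples)
+az ts create --name storageSpec --version "1.0" --resource-group templateSpecRG --location westus2 --template-file ./main.bicep
+```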
### Microsoft Learn
azure-resource-manager Deploy Rest https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/templates/deploy-rest.md
Title: Deploy resources with REST API and template description: Use Azure Resource Manager and Resource Manager REST API to deploy resources to Azure. The resources are defined in a Resource Manager template. Previously updated : 10/22/2020 Last updated : 02/01/2022 # Deploy resources with ARM templates and Azure Resource Manager REST API
The examples in this article use resource group deployments.
GET https://management.azure.com/subscriptions/{subscriptionId}/resourcegroups/{resourceGroupName}/providers/Microsoft.Resources/deployments/{deploymentName}?api-version=2020-10-01 ```
+## Deploy with ARMClient
+
+ARMClient is a simple command-line tool for invoking the Azure Resource Manager API. To install the tool, see [ARMClient](https://github.com/projectkudu/ARMClient).
+
+To list your subscriptions:
+
+```cmd
+armclient GET /subscriptions?api-version=2021-04-01
+```
+
+To list your resource groups:
+
+```cmd
+armclient GET /subscriptions/<subscription-id>/resourceGroups?api-version=2021-04-01
+```
+
+Replace **&lt;subscription-id>** with your Azure subscription ID.
+
+To create a resource group in the *Central US* region:
+
+```cmd
+armclient PUT /subscriptions/<subscription-id>/resourceGroups/<resource-group-name>?api-version=2021-04-01 "{location: 'central us', properties: {}}"
+```
+
+Alternatively, you can put the body into a JSON file called **CreateRg.json**:
+
+```json
+{
+ "location": "Central US",
+ "properties": { }
+}
+```
+
+```cmd
+armclient PUT /subscriptions/<subscription-id>/resourceGroups/<resource-group-name>?api-version=2021-04-01 '@CreateRg.json'
+```
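+
+As a further sketch, ARMClient can also call the deployments endpoint shown earlier in this article. The IDs, names, and template URI below are placeholders:
+
+```cmd
+REM Placeholder IDs, names, and template URI
+armclient PUT /subscriptions/<subscription-id>/resourcegroups/<resource-group-name>/providers/Microsoft.Resources/deployments/<deployment-name>?api-version=2020-10-01 "{properties: {mode: 'Incremental', templateLink: {uri: '<template-uri>'}, parameters: {}}}"
+```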
+
+For more information, see [ARMClient: a command line tool for the Azure API](http://blog.davidebbo.com/2015/01/azure-resource-manager-client.html).
+
## Deployment name

You can give your deployment a name such as `ExampleDeployment`.
azure-sql Connectivity Settings https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/connectivity-settings.md
The connectivity settings are accessible from the **Firewalls and virtual networ
## Deny public network access
-The defaukt for this setting is **No** so that customers can connect by using either public endpoints (with IP-based server- level firewall rules or with virtual-network firewall rules) or private endpoints (by using Azure Private Link), as outlined in the [network access overview](network-access-controls-overview.md).
+The default for this setting is **No**, so that customers can connect by using either public endpoints (with IP-based server-level firewall rules or with virtual-network firewall rules) or private endpoints (by using Azure Private Link), as outlined in the [network access overview](network-access-controls-overview.md).
When **Deny public network access** is set to **Yes**, only connections via private endpoints are allowed. All connections via public endpoints will be denied with an error message similar to:
azure-sql Resource Limits https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/managed-instance/resource-limits.md
Support for the premium-series hardware generations (public preview) is currentl
| Australia East | Yes | Yes | | Canada Central | Yes | | | Canada East | Yes | |
+| Central US | Yes | |
+| East US | Yes | |
| East US 2 | Yes | | | France Central | | Yes | | Germany West Central | | Yes |
azure-sql Sql Assessment For Sql Vm https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/virtual-machines/windows/sql-assessment-for-sql-vm.md
The SQL Assessment feature of the Azure portal identifies possible performance i
The SQL Assessment feature is currently in preview.
+To learn more, watch this video on [SQL Assessment](/shows/Data-Exposed/?WT.mc_id=dataexposed-c9-niner):
+<iframe src="https://aka.ms/docs/player?id=13b2bf63-485c-4ec2-ab14-a1217734ad9f" width="640" height="370" style="border: 0; max-width: 100%; min-width: 100%;"></iframe>
+
## Overview

Once the SQL Assessment feature is enabled, your SQL Server instance and databases are scanned to provide recommendations for things like indexes, deprecated features, enabled or missing trace flags, statistics, and so on. Recommendations are surfaced to the [SQL VM management page](manage-sql-vm-portal.md) of the [Azure portal](https://portal.azure.com/#blade/HubsExtension/BrowseResource/resourceType/Microsoft.SqlVirtualMachine%2FSqlVirtualMachines).
backup Backup Azure Sap Hana Database Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/backup/backup-azure-sap-hana-database-troubleshoot.md
Title: Troubleshoot SAP HANA databases backup errors description: Describes how to troubleshoot common errors that might occur when you use Azure Backup to back up SAP HANA databases. Previously updated : 05/31/2021 Last updated : 02/01/2022+++ # Troubleshoot backup of SAP HANA databases on Azure
-This article provides troubleshooting information for backing up SAP HANA databases on Azure virtual machines. For more information on the SAP HANA backup scenarios we currently support, see [Scenario support](sap-hana-backup-support-matrix.md#scenario-support).
+This article provides troubleshooting information for backing up SAP HANA databases on Azure virtual machines. For more information on the SAP HANA backup scenarios we currently support, see [Scenario support](sap-hana-backup-support-matrix.md#scenario-support).
## Prerequisites and Permissions
Refer to the [prerequisites](tutorial-backup-sap-hana-db.md#prerequisites) and [
| **Error message** | `Azure Backup does not have required role privileges to carry out Backup and Restore operations` | | - | |
-| **Possible causes** | All operations will fail with this error when the Backup user (AZUREWLBACKUPHANAUSER) doesn't have the **SAP_INTERNAL_HANA_SUPPORT** role assigned or the role may have been overwritten. |
+| **Possible causes** | All operations fail with this error when the Backup user (AZUREWLBACKUPHANAUSER) doesn't have the **SAP_INTERNAL_HANA_SUPPORT** role assigned, or when the role has been overwritten. |
| **Recommended action** | Download and run the [pre-registration script](https://aka.ms/scriptforpermsonhana) on the SAP HANA instance, or manually assign the **SAP_INTERNAL_HANA_SUPPORT** role to the Backup user (AZUREWLBACKUPHANAUSER).<br><br>**Note**<br><br>If you are using HANA 2.0 SPS04 Rev 46 and later, this error doesn't occur as the use of the **SAP_INTERNAL_HANA_SUPPORT** role is deprecated in these HANA versions. | ### UserErrorInOpeningHanaOdbcConnection | **Error message** | `Failed to connect to HANA system` | | | |
-| **Possible causes** | <ul><li>Connection to HANA instance failed</li><li>System DB is offline</li><li>Tenant DB is offline</li><li>Backup user (AZUREWLBACKUPHANAUSER) doesn't have enough permissions/privileges.</li></ul> |
-| **Recommended action** | Check if the system is up and running. If the database(s) is up and running, ensure that the required permissions are set by downloading and running the [pre-registration script](https://aka.ms/scriptforpermsonhana) on the SAP HANA instance. |
+| **Possible causes** | <ul><li>Connection to HANA instance failed</li><li>System DB is offline</li><li>Tenant DB is offline</li><li>Backup user (AZUREWLBACKUPHANAUSER) doesn't have enough permissions/privileges.</li></ul> |
+| **Recommended action** | Check if the system is up and running. If the databases are up and running, ensure that the required permissions are set. To do so, download and run the [pre-registration script](https://aka.ms/scriptforpermsonhana) on the SAP HANA instance. |
### UserErrorHanaInstanceNameInvalid | **Error message** | `The specified SAP HANA instance is either invalid or can't be found` | | | | | **Possible causes** | <ul><li>The specified SAP HANA instance is either invalid or can't be found.</li><li>Multiple SAP HANA instances on a single Azure VM can't be backed up.</li></ul> |
-| **Recommended action** | <ul><li>Ensure that only one HANA instance is running on the Azure VM.</li><li>Run the script from the Discover DB pane (you can also find this [here](https://aka.ms/scriptforpermsonhana)) with the correct SAP HANA instance to resolve the issue.</li></ul> |
+| **Recommended action** | <ul><li>Ensure that only one HANA instance is running on the Azure VM.</li><li> To resolve the issue, run the script from the _Discover DB_ pane (you can also find the script [here](https://aka.ms/scriptforpermsonhana)) with the correct SAP HANA instance.</li></ul> |
### UserErrorHANALSNValidationFailure | **Error message** | `Backup log chain is broken` | | | |
-| **Possible causes** | HANA LSN Log chain break can be triggered for various reasons, including:<ul><li>Azure Storage call failure to commit backup.</li><li>The Tenant DB is offline.</li><li>Extension upgrade has terminated an in-progress Backup job.</li><li>Unable to connect to Azure Storage during backup.</li><li>SAP HANA has rolled back a transaction in the backup process.</li><li>A backup is complete, but catalog is not yet updated with success in HANA system.</li><li>Backup failed from Azure Backup perspective, but success from HANA's perspective - the log backup/catalog destination may have been updated from backint to file system, or the backint executable may have been changed.</li></ul> |
-| **Recommended action** | To resolve this issue, Azure Backup triggers an auto-heal Full backup. While this auto-heal backup is in progress, all log backups are triggered by HANA fail with **OperationCancelledBecauseConflictingAutohealOperationRunningUserError**. Once the auto-heal Full backup is complete, logs and all other backups will start working as expected.<br>If you do not see an auto-heal full backup triggered or any successful backup (Full/Differential/ Incremental) in 24 hours, contact Microsoft support.</br> |
+| **Possible causes** | HANA LSN Log chain break can be triggered for various reasons, including:<ul><li>Azure Storage call failure to commit backup.</li><li>The Tenant DB is offline.</li><li>Extension upgrade has terminated an in-progress Backup job.</li><li>Unable to connect to Azure Storage during backup.</li><li>SAP HANA has rolled back a transaction in the backup process.</li><li>A backup is complete, but the catalog is not yet updated with success in the HANA system.</li><li>Backup failed from the Azure Backup perspective, but succeeded from the HANA perspective: the log backup/catalog destination might have been updated from backint to file system, or the backint executable might have been changed.</li></ul> |
+| **Recommended action** | To resolve this issue, Azure Backup triggers an auto-heal Full backup. While this auto-heal backup is in progress, all log backups triggered by HANA fail with **OperationCancelledBecauseConflictingAutohealOperationRunningUserError**. Once the auto-heal Full backup is complete, logs and all other backups start working as expected.<br>If you don't see an auto-heal full backup triggered, or any successful backup (Full/Differential/Incremental) in 24 hours, contact Microsoft support.</br> |
### UserErrorSDCtoMDCUpgradeDetected | **Error message** | `SDC to MDC upgrade detected.` | | | |
-| **Possible causes** | When an SDC system is upgraded to MDC, backups fail with this error. |
+| **Possible causes** | When an SDC system is upgraded to MDC, backups fail with this error. |
| **Recommended action** | To troubleshoot and resolve the issue, see [SDC to MDC upgrade](#sdc-to-mdc-upgrade-with-a-change-in-sid). | ### UserErrorInvalidBackintConfiguration
Refer to the [prerequisites](tutorial-backup-sap-hana-db.md#prerequisites) and [
| **Error message** | `Backups will fail with this error when the Backint Configuration is incorrectly updated.` | | | | | **Possible causes** | The backint configuration updated during the Configure Protection flow by Azure Backup is either altered/updated by the customer. |
-| **Recommended action** | Check if the following (backint) parameters are set:<br><ul><li>[catalog_backup_using_backint:true]</li><li>[enable_accumulated_catalog_backup:false]</li><li>[parallel_data_backup_backint_channels:1]</li><li>[log_backup_timeout_s:900)]</li><li>[backint_response_timeout:7200]</li></ul>If backint-based parameters are present at the HOST level, remove them. However, if parameters aren't present at the HOST level but have been manually modified at a database level, ensure that the database level values are set above. Or, run [stop protection with retain backup data](./sap-hana-db-manage.md#stop-protection-for-an-sap-hana-database) from the Azure portal, and then select Resume backup. |
+| **Recommended action** | Check if the following (backint) parameters are set:<br><ul><li>[catalog_backup_using_backint:true]</li><li>[enable_accumulated_catalog_backup:false]</li><li>[parallel_data_backup_backint_channels:1]</li><li>[log_backup_timeout_s:900]</li><li>[backint_response_timeout:7200]</li></ul>If backint-based parameters are present at the HOST level, remove them. However, if the parameters aren't present at the HOST level but have been manually modified at a database level, ensure that the database-level values match the values listed above. Or, run [stop protection with retain backup data](./sap-hana-db-manage.md#stop-protection-for-an-sap-hana-database) from the Azure portal, and then select Resume backup. |
### UserErrorIncompatibleSrcTargetSystemsForRestore |**Error message** | `The source and target systems for restore are incompatible.` | ||| |**Possible causes** | The restore flow fails with this error when the source and target HANA databases, and systems are incompatible. |
-|Recommended action | Ensure that your restore scenario isn't in the following list of possible incompatible restores:<br> **Case 1:** SYSTEMDB cannot be renamed during restore.<br>**Case 2:** Source - SDC and target - MDC: The source database cannot be restored as SYSTEMDB or tenant DB on the target. <br> **Case 3:** Source - MDC and target - SDC: The source database (SYSTEMDB or tenant DB) cannot be restored to the target.<br>To learn more, see the note **1642148** in the [SAP support launchpad](https://launchpad.support.sap.com). |
+|Recommended action | Ensure that your restore scenario isn't in the following list of possible incompatible restores:<br> **Case 1:** SYSTEMDB cannot be renamed during restore.<br>**Case 2:** The source is SDC and the target is MDC: the source database cannot be restored as SYSTEMDB or as a tenant DB on the target. <br> **Case 3:** The source is MDC and the target is SDC: the source database (SYSTEMDB or tenant DB) cannot be restored to the target.<br>To learn more, see the note **1642148** in the [SAP support launchpad](https://launchpad.support.sap.com). |
### UserErrorHANAPODoesNotExist **Error message** | `Database configured for backup does not exist.` | --
-**Possible causes** | If a database that has been configured for backup is deleted, then all scheduled and ad-hoc backups on this database will fail.
+**Possible causes** | If you delete a database that is configured for backup, all scheduled and on-demand backups on this database will fail.
**Recommended action** | Verify if the database is deleted. Re-create the database or [stop protection](sap-hana-db-manage.md#stop-protection-for-an-sap-hana-database) (with or without retain data) for the database. ### UserErrorInsufficientPrivilegeOfDatabaseUser **Error message** | `Azure Backup does not have enough privileges to carry out Backup and Restore operations.` - |
-**Possible causes** | Backup user (AZUREWLBACKUPHANAUSER) created by the pre-registration script doesn't have one or more of the following roles assigned:<ul><li>For MDC, DATABASE ADMIN and BACKUP ADMIN (for HANA 2.0 SPS05 and later) to create new databases during restore.</li><li>For SDC, BACKUP ADMIN to create new databases during restore.</li><li>CATALOG READ to read the backup catalog.</li><li>SAP_INTERNAL_HANA_SUPPORT to access a few private tables. Only required for SDC and MDC versions prior to HANA 2.0 SPS04 Rev 46. This is not required for HANA 2.0 SPS04 Rev 46 and later as we are getting the required information from public tables now with the fix from HANA team.</li></ul>
-**Recommended action** | To resolve the issue, add the required roles and permissions manually to the Backup user (AZUREWLBACKUPHANAUSER), or download and run the pre-registration script on the [SAP HANA instance](https://aka.ms/scriptforpermsonhana).
+**Possible causes** | Backup user (AZUREWLBACKUPHANAUSER) created by the pre-registration script doesn't have one or more of the following roles assigned:<ul><li>For MDC, DATABASE ADMIN and BACKUP ADMIN (for HANA 2.0 SPS05 and later), to create new databases during restore.</li><li>For SDC, BACKUP ADMIN, to create new databases during restore.</li><li>CATALOG READ, to read the backup catalog.</li><li>SAP_INTERNAL_HANA_SUPPORT, to access a few private tables. This role is only required for SDC and MDC versions prior to HANA 2.0 SPS04 Rev 46. It isn't required for HANA 2.0 SPS04 Rev 46 and later, because the required information is now read from public tables, with the fix from the HANA team.</li></ul>
+**Recommended action** | To resolve the issue, add the required roles and permissions manually to the Backup user (AZUREWLBACKUPHANAUSER), or download and run the [pre-registration script](https://aka.ms/scriptforpermsonhana) on the SAP HANA instance.
### UserErrorDatabaseUserPasswordExpired
Refer to the [prerequisites](tutorial-backup-sap-hana-db.md#prerequisites) and [
**Error message** | `Remedial Backup in progress.` - | -
-**Possible causes** | Azure Backup triggers a remedial full backup to handle LSN log chain break. While this remedial full is in progress, backups (Full/ Differential/Incremental) triggered through the portal/CLI fails with this error.
-**Recommended action** | Wait for the remedial full backup to complete successfully before triggering another backup.
+**Possible causes** | Azure Backup triggers a remedial full backup to handle an LSN log chain break. While the remedial full backup is in progress, backups (Full/Differential/Incremental) triggered through the portal/CLI fail with this error.
+**Recommended action** | Wait for the remedial full backup to complete successfully before you trigger another backup.
### OperationCancelledBecauseConflictingOperationRunningUserError **Error message** | `Conflicting operation in progress.` -- | - **Possible causes** | A Full/Differential/Incremental backup triggered through portal/CLI/native HANA clients, while another Full/Differential/Incremental backup is already in progress.
-**Recommended action** | Wait for the active backup job to complete before triggering a new Full/delta backup.
+**Recommended action** | Wait for the active backup job to complete before you trigger a new Full/delta backup.
### OperationCancelledBecauseConflictingAutohealOperationRunning UserError **Error message** | `Auto-heal Full backup in progress.` - | -
-**Possible causes** | Azure Backup triggers an auto-heal Full backup to resolve **UserErrorHANALSNValidationFailure**. While this auto-heal backup is in progress, all the log backups triggered by HANA fail with **OperationCancelledBecauseConflictingAutohealOperationRunningUserError**.<br>Once the auto-heal Full backup is complete, logs and all other backups will start working as expected.</br>
-**Recommended action** | Wait for the auto-heal Full backup to complete before triggering a new Full/delta backup.
+**Possible causes** | Azure Backup triggers an auto-heal Full backup to resolve **UserErrorHANALSNValidationFailure**. While this auto-heal backup is in progress, all the log backups triggered by HANA fail with **OperationCancelledBecauseConflictingAutohealOperationRunningUserError**.<br>Once the auto-heal Full backup is complete, logs and all other backups start working as expected.</br>
+**Recommended action** | Wait for the auto-heal Full backup to complete before you trigger a new Full/delta backup.
### UserErrorHanaPreScriptNotRun **Error message** | `Pre-registration script not run.` | --
-**Possible causes** | The SAP HANA pre-registration script for setting up the environment has not been run.
+**Possible causes** | The SAP HANA pre-registration script that sets up the environment hasn't been run.
**Recommended action** | Download and run the [pre-registration script](https://aka.ms/scriptforpermsonhana) on the SAP HANA instance.
Refer to the [prerequisites](tutorial-backup-sap-hana-db.md#prerequisites) and [
**Error message** | `RecoverySys.py could not be run successfully to restore System DB.` -- |
-**Possible causes** | Possible causes for System DB restore to fail are:<ul><li>Azure Backup is unable to find **Recoverysys.py** on the HANA machine. This happens when the HANA environment isn't set up properly.</li><li>**Recoverysys.py** is present, but triggering this script has failed to invoke HANA to perform the restore.</li><li>Recoverysys.py has successfully invoked HANA to perform the restore, but HANA fails to restore.</li></ul>
-**Recommended action** | <ul><li>For issue 1, work with the SAP HANA team to fix the issue.</li><li>For 2 and 3, see the log trace by running the HDSetting.sh command in sid-adm prompt. For example, _/usr/sap/SID/HDB00/HDBSetting.sh_.</li></ul>Share these findings with the SAP HANA team to get the issue fixed.
+**Possible causes** | Possible causes for System DB restore to fail are:<ul><li>Azure Backup is unable to find **Recoverysys.py** on the HANA machine. This happens when the HANA environment isn't set up properly.</li><li>**Recoverysys.py** is present, but triggering the script fails to invoke HANA to perform the restore.</li><li>**Recoverysys.py** has successfully invoked HANA to perform the restore, but HANA fails to restore.</li></ul>
+**Recommended action** | <ul><li>For issue 1, work with the SAP HANA team to fix the issue.</li><li>For issues 2 and 3, run the HDBSetting.sh command at the sid-adm prompt and check the log trace. For example, _/usr/sap/SID/HDB00/HDBSetting.sh_.</li></ul>Share these findings with the SAP HANA team to get the issue fixed.
### UserErrorDBNameNotInCorrectFormat
Assume an SDC HANA instance "H21" is backed up. The backup items page will show
Note the following points: -- By default, the restored db name will be populated with the backup item name. In this case, h21(sdc).-- Selecting the target as H11 won't change the restored db name automatically. **It should be edited to h11(sdc)**. Regarding SDC, the restored db name will be the target instance ID with lowercase letters and 'sdc' appended in brackets.
+- By default, the restored database name will be populated with the backup item name. In this case, h21(sdc).
+- Selecting the target as H11 won't change the restored database name automatically. **It should be edited to h11(sdc)**. For SDC, the restored database name will be the target instance ID in lowercase letters, with 'sdc' appended in brackets.
- Since SDC can have only single database, you also need to select the checkbox to allow override of the existing database data with the recovery point data. - Linux is case-sensitive. So be careful to preserve the case. ### Multiple Container Database (MDC) restore
-In multiple container databases for HANA, the standard configuration is SYSTEMDB + 1 or more Tenant DBs. Restoring an entire SAP HANA instance means to restore both SYSTEMDB and Tenant DBs. One restores SYSTEMDB first and then proceeds for Tenant DB. System DB essentially means to override the system information on the selected target. This restore also overrides the BackInt related information in the target instance. So after the system DB is restored to a target instance, run the pre-registration script again. Only then the subsequent tenant DB restores will succeed.
+In multiple container databases for HANA, the standard configuration is SYSTEMDB + 1 or more Tenant DBs. Restoring an entire SAP HANA instance means restoring both SYSTEMDB and the Tenant DBs: SYSTEMDB is restored first, and then the Tenant DBs. Restoring the system DB overrides the system information on the selected target, including the BackInt-related information in the target instance. So after the system DB is restored to a target instance, run the pre-registration script again. Only then will the subsequent tenant DB restores succeed.
## Back up a replicated VM ### Scenario 1
-The original VM was replicated using Azure Site Recovery or Azure VM backup. The new VM was built to simulate the old VM. That is, the settings are exactly the same. (This is because the original VM was deleted and the restore was done from VM backup or Azure Site Recovery).
+The original VM was replicated using Azure Site Recovery or Azure VM backup. The new VM was built to simulate the old VM. That is, the settings are exactly the same, because the original VM was deleted and then restored from VM backup or Azure Site Recovery.
This scenario could include two possible cases. Learn how to back up the replicated VM in both of these cases:
This scenario could include two possible cases. Learn how to back up the replica
- a different name than the deleted VM - the same name as the deleted VM but is in a different resource group or subscription (as compared to the deleted VM)
- If this is the case, then do the following steps:
+ If so, then follow these steps:
- The extension is already present on the VM, but isn't visible to any of the services - Run the pre-registration script
- - If you discover and protect the new databases, you'll start seeing duplicate active databases in the portal. To avoid this, [Stop protection with retain data](sap-hana-db-manage.md#stop-protection-for-an-sap-hana-database) for the old databases. Then continue with the remaining steps.
- - Discover the databases to enable backup
+ - If you discover and protect the new databases, you start seeing duplicate active databases in the portal. To avoid this, [Stop protection with retain data](sap-hana-db-manage.md#stop-protection-for-an-sap-hana-database) for the old databases. Then continue with the remaining steps.
+ - Discover the databases
- Enable backups on these databases
- - The already existing backed up databases (from the deleted VM) will continue to be stored in the vault (with their backups being retained according to the policy)
+ - The already existing backed-up databases (from the deleted VM) continue to be stored in the vault, with their backups retained according to the policy.
### Scenario 2
-The original VM was replicated using Azure Site Recovery or Azure VM backup. The new VM was built out of the content – to be used as a template. This is a new VM with a new SID.
+The original VM was replicated using Azure Site Recovery or Azure VM backup. The new VM was built from the replicated content, to be used as a template. This VM is new, with a new SID.
-Follow these steps to enable backups on the new VM:
+Follow these steps to enable backups on the new VM:
- The extension is already present on the VM, but not visible to any of the services - Run the pre-registration script. Based on the SID of the new VM, two scenarios can arise:
- - The original VM and the new VM have the same SID. The pre-registration script will run successfully.
- - The original VM and the new VM have different SIDs. The pre-registration script will fail. Contact support to get help in this scenario.
+ - The original VM and the new VM have the same SID. The pre-registration script runs successfully.
+ - The original VM and the new VM have different SIDs. The pre-registration script fails. Contact support to get help in this scenario.
- Discover the databases that you want to back up - Enable backups on these databases
Upgrades to the OS, SDC version change, or MDC version change that don't cause a
- Ensure that the new OS version, SDC, or MDC version are currently [supported by Azure Backup](sap-hana-backup-support-matrix.md#scenario-support) - [Stop protection with retain data](sap-hana-db-manage.md#stop-protection-for-an-sap-hana-database) for the database - Perform the upgrade or update-- Rerun the pre-registration script. Often, the upgrade process may remove [the necessary roles](tutorial-backup-sap-hana-db.md#what-the-pre-registration-script-does). Running the pre-registration script will help verify all the required roles.
+- Rerun the pre-registration script. The upgrade process might remove [the necessary roles](tutorial-backup-sap-hana-db.md#what-the-pre-registration-script-does); rerunning the script verifies that all the required roles are in place.
- Resume protection for the database again ## SDC to MDC upgrade with no change in SID
Upgrades from SDC to MDC that don't cause a SID change can be handled as follows
- Rerun the [pre-registration script](https://aka.ms/scriptforpermsonhana) - Re-register the extension for the same machine in the Azure portal (**Backup** -> **View details** -> Select the relevant Azure VM -> Re-register) - Select **Rediscover DBs** for the same VM. This action should show the new DBs in step 3 as SYSTEMDB and Tenant DB, not SDC-- The older SDC database will continue to exist in the vault and have the old backed-up data retained according to the policy
+- The older SDC database continues to exist in the vault and retains the old backed-up data according to the policy
- Configure backup for these databases ## SDC to MDC upgrade with a change in SID
Upgrades from SDC to MDC that cause a SID change can be handled as follows:
- Ensure that the new MDC version is currently [supported by Azure Backup](sap-hana-backup-support-matrix.md#scenario-support) - **Stop protection with retain data** for the old SDC database
+- Move the _config.json_ file located at _/opt/msawb/etc/config/SAPHana/_ out of that folder (for example, to a backup location).
- Perform the upgrade. After completion, the HANA system is now MDC with a system DB and tenant DBs-- Rerun the [pre-registration script](https://aka.ms/scriptforpermsonhana) with correct details (new SID and MDC). Due to a change in SID, you may face issues with successfully running the script. Contact Azure Backup support if you face issues.
+- Rerun the [pre-registration script](https://aka.ms/scriptforpermsonhana) with the correct details (new SID and MDC). Due to the change in SID, the script might not run successfully; contact Azure Backup support if you face issues.
- Re-register the extension for the same machine in the Azure portal (**Backup** -> **View details** -> Select the relevant Azure VM -> Re-register) - Select **Rediscover DBs** for the same VM. This action should show the new DBs in step 3 as SYSTEMDB and Tenant DB, not SDC-- The older SDC database will continue to exist in the vault and have old backed up data retained according to the policy
+- The older SDC database continues to exist in the vault and retains the old backed-up data according to the policy
- Configure backup for these databases ## Re-registration failures
Check for one or more of the following symptoms before you trigger the re-regist
- The VM is shut down, so backups can't take place - Network issues
-These symptoms may arise for one or more of the following reasons:
+These symptoms might arise for one or more of the following reasons:
- An extension was deleted or uninstalled from the portal. - The VM was restored back in time through in-place disk restore. - The VM was shut down for an extended period, so the extension configuration on it expired.-- The VM was deleted, and another VM was created with the same name and in the same resource group as the deleted VM.
+- The VM was deleted, and a new VM was created with the same name and in the same resource group as the deleted VM.
In the preceding scenarios, we recommend that you trigger a re-register operation on the VM. ## Next steps -- Review the [frequently asked questions](./sap-hana-faq-backup-azure-vm.yml) about backing up SAP HANA databases on Azure VMs.
+- Review the [frequently asked questions](./sap-hana-faq-backup-azure-vm.yml) about the backup of SAP HANA databases on Azure VMs.
bastion Connect Native Client Windows https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/bastion/connect-native-client-windows.md
Title: 'Connect to a VM using the native Windows client and Azure Bastion'
+ Title: 'Connect to a VM using the native client and Azure Bastion'
-description: Learn how to connect to a VM from a Windows computer by using Bastion and the native Windows client.
+description: Learn how to connect to a VM from a Windows computer by using Bastion and the native client.
Previously updated : 12/01/2021 Last updated : 01/31/2022 # Connect to a VM using Bastion and the native client on your workstation (Preview)
-Azure Bastion now offers support for connecting to target VMs in Azure using a native RDP or SSH client on your local workstation. This feature lets you connect to your target VMs via Bastion using Azure CLI and expands your sign-in options to include local SSH key pair and Azure Active Directory (Azure AD). This article helps you configure Bastion with the required settings, and then connect to a VM in the VNet. For more information, see the [What is Azure Bastion?](bastion-overview.md).
+Azure Bastion offers support for connecting to target VMs in Azure using a native RDP or SSH client on your local workstation. This feature lets you connect to your target VMs via Bastion using Azure CLI, and expands your sign-in options to include a local SSH key pair and Azure Active Directory (Azure AD). This article helps you configure Bastion with the required settings, and then connect to a VM in the VNet. For more information, see [What is Azure Bastion?](bastion-overview.md).
> [!NOTE] > This configuration requires the Standard SKU for Azure Bastion.
Azure Bastion now offers support for connecting to target VMs in Azure using a n
Currently, this feature has the following limitations:
-* Signing in using an SSH private key stored in Azure Key Vault is not supported with this feature. Download your private key to a file on your local machine before signing in to your Linux VM using an SSH key pair.
+* Signing in using an SSH private key stored in Azure Key Vault isn't supported with this feature. Download your private key to a file on your local machine before signing in to your Linux VM using an SSH key pair.
## <a name="prereq"></a>Prerequisites
-Before you begin, verify that you have met the following criteria:
+Before you begin, verify that you've met the following criteria:
* The latest version of the CLI commands (version 2.32 or later) is installed. For information about installing the CLI commands, see [Install the Azure CLI](/cli/azure/install-azure-cli) and [Get Started with Azure CLI](/cli/azure/get-started-with-azure-cli). * An Azure virtual network. * A virtual machine in the virtual network. * If you plan to sign in to your virtual machine using your Azure AD credentials, make sure your virtual machine is set up using one of the following methods:
- * Enable Azure AD login for a [Windows VM](../active-directory/devices/howto-vm-sign-in-azure-ad-windows.md) or [Linux VM](../active-directory/devices/howto-vm-sign-in-azure-ad-linux.md).
+ * Enable Azure AD sign-in for a [Windows VM](../active-directory/devices/howto-vm-sign-in-azure-ad-windows.md) or [Linux VM](../active-directory/devices/howto-vm-sign-in-azure-ad-linux.md).
* [Configure your Windows VM to be Azure AD-joined](../active-directory/devices/concept-azure-ad-join.md). * [Configure your Windows VM to be hybrid Azure AD-joined](../active-directory/devices/concept-azure-ad-join-hybrid.md).
Verify that the following roles and ports are configured in order to connect.
* Reader role on the virtual machine. * Reader role on the NIC with private IP of the virtual machine. * Reader role on the Azure Bastion resource.
-* Virtual Machine Administrator Login or Virtual Machine User Login role, if you are using the Azure AD sign-in method. You only need to do this if you're enabling Azure AD login using the process described in this article: [Azure Windows VMs and Azure AD](../active-directory/devices/howto-vm-sign-in-azure-ad-windows.md) or [Azure Linux VMs and Azure AD](../active-directory/devices/howto-vm-sign-in-azure-ad-linux.md).
+* Virtual Machine Administrator Login or Virtual Machine User Login role, if you're using the Azure AD sign-in method. You only need this if you're enabling Azure AD login using the process described in one of these articles: [Azure Windows VMs and Azure AD](../active-directory/devices/howto-vm-sign-in-azure-ad-windows.md) or [Azure Linux VMs and Azure AD](../active-directory/devices/howto-vm-sign-in-azure-ad-linux.md).
### Ports To connect to a Linux VM using native client support, you must have the following ports open on your Linux VM: * Inbound port: SSH (22) *or*
-* Inbound port: Custom value (you will then need to specify this custom port when you connect to the VM via Azure Bastion)
+* Inbound port: Custom value (you'll then need to specify this custom port when you connect to the VM via Azure Bastion)
To connect to a Windows VM using native client support, you must have the following ports open on your Windows VM: * Inbound port: RDP (3389) *or*
-* Inbound port: Custom value (you will then need to specify this custom port when you connect to the VM via Azure Bastion)
+* Inbound port: Custom value (you'll then need to specify this custom port when you connect to the VM via Azure Bastion)
## <a name="connect"></a>Connect to a VM from a Windows local workstation
This section helps you connect to your virtual machine from a Windows local work
> If you want to specify a custom port value, you should also include the field **--resource-port** in the sign-in command. >
- * If you are signing in to an Azure AD login-enabled VM, use the following command. To learn more about how to use Azure AD to sign in to your Azure Linux VMs, see [Azure Linux VMs and Azure AD](../active-directory/devices/howto-vm-sign-in-azure-ad-linux.md).
+ * If you're signing in to an Azure AD login-enabled VM, use the following command. For more information, see [Azure Linux VMs and Azure AD](../active-directory/devices/howto-vm-sign-in-azure-ad-linux.md).
```azurecli-interactive az network bastion ssh --name "<BastionName>" --resource-group "<ResourceGroupName>" --target-resource-id "<VMResourceId>" --auth-type "AAD" ```
- * If you are signing in using an SSH key pair, use the following command.
+ * If you're signing in using an SSH key pair, use the following command.
```azurecli-interactive az network bastion ssh --name "<BastionName>" --resource-group "<ResourceGroupName>" --target-resource-id "<VMResourceId>" --auth-type "ssh-key" --username "<Username>" --ssh-key "<Filepath>" ```
- * If you are signing in using a local username and password, use the following command. You will then be prompted for the password for the target VM.
+ * If you're signing in using a local username and password, use the following command. You'll then be prompted for the password for the target VM.
```azurecli-interactive az network bastion ssh --name "<BastionName>" --resource-group "<ResourceGroupName>" --target-resource-id "<VMResourceId>" --auth-type "password" --username "<Username>" ``` > [!NOTE]
- > VM sessions using the **az network bastion ssh** command do not support file transfer. To use file transfer with SSH over Bastion, please see the section on the **az network bastion tunnel** command further below.
+ > VM sessions using the **az network bastion ssh** command do not support file transfer. To use file transfer with SSH over Bastion, see the [az network bastion tunnel](#connect-tunnel) section.
> ### <a name="connect-windows"></a>Connect to a Windows VM
This section helps you connect to your virtual machine from a Windows local work
> If you want to specify a custom port value, you should also include the field **--resource-port** in the sign-in command. >
- * To connect via RDP, use the following command. You will then be prompted to input your credentials. You can use either a local username and password or your Azure AD credentials. To learn more about how to use Azure AD to sign in to your Azure Windows VMs, see [Azure Windows VMs and Azure AD](../active-directory/devices/howto-vm-sign-in-azure-ad-windows.md).
+ * To connect via RDP, use the following command. You'll then be prompted to input your credentials. You can use either a local username and password or your Azure AD credentials. For more information, see [Azure Windows VMs and Azure AD](../active-directory/devices/howto-vm-sign-in-azure-ad-windows.md).
```azurecli-interactive az network bastion rdp --name "<BastionName>" --resource-group "<ResourceGroupName>" --target-resource-id "<VMResourceId>"
This section helps you connect to your virtual machine from a Windows local work
## <a name="connect-tunnel"></a>Connect to a VM using the *az network bastion tunnel* command This section helps you connect to your virtual machine using the *az network bastion tunnel* command, which allows you to:
-* Use native clients on *non*-Windows local workstations (ex: a Linux PC)
-* Use a native client of your choice
-* Set up concurrent VM sessions with Bastion
-* Access file transfer for SSH sessions
+* Use native clients on *non*-Windows local workstations (ex: a Linux PC).
+* Use a native client of your choice.
+* Set up concurrent VM sessions with Bastion.
+* Access file transfer for SSH sessions.
1. Sign in to your Azure account and select your subscription containing your Bastion resource.
This section helps you connect to your virtual machine using the *az network bas
```azurecli-interactive az network bastion tunnel --name "<BastionName>" --resource-group "<ResourceGroupName>" --target-resource-id "<VMResourceId>" --resource-port "<TargetVMPort>" --port "<LocalMachinePort>" ```
-3. Connect and log in to your target VM using SSH or RDP, the native client of your choice, and the local machine port you specified in Step 2.
+3. Connect and sign in to your target VM using SSH or RDP, the native client of your choice, and the local machine port you specified in Step 2.
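For example, a hypothetical SSH session through a tunnel opened on local port 2222 (the port and username are illustrative):

```bash
# Connect to the target VM through the Bastion tunnel listening on 127.0.0.1:2222
ssh azureuser@127.0.0.1 -p 2222
```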
## Next steps
bastion Vm Upload Download Native https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/bastion/vm-upload-download-native.md
+
+ Title: 'Upload and download files - native client'
+
+description: Learn how to upload and download files using Azure Bastion and a native client.
+++++ Last updated : 01/31/2022+
+# Customer intent: I want to upload or download files using Bastion.
+++
+# Upload and download files using the native client: Azure Bastion (Preview)
+
+Azure Bastion offers support for file transfer between your target VM and local computer using Bastion and a native RDP or SSH client. To learn more about native client support, refer to [Connect to a VM using the native client](connect-native-client-windows.md). You can use either SSH or RDP to upload files to a VM from your local computer. To download files from a VM, you must use RDP.
+
+> [!NOTE]
+> Uploading and downloading files is supported using the native client only. You can't upload and download files using PowerShell or via the Azure portal.
+>
+
+## Upload and download files - RDP
+
+This section helps you transfer files between your local Windows computer and your target VM over RDP. Once connected to the target VM, you can transfer files using right-click, then **Copy** and **Paste**.
+
+1. Sign in to your Azure account and select your subscription containing your Bastion resource.
+
+ ```azurecli-interactive
+ az login
+ az account list
+ az account set --subscription "<subscription ID>"
+ ```
+
+1. Sign in to your target VM via RDP using the following command. You can use either a local username and password, or your Azure AD credentials. To learn more about how to use Azure AD to sign in to your Azure Windows VMs, see [Azure Windows VMs and Azure AD](../active-directory/devices/howto-vm-sign-in-azure-ad-windows.md).
+
+ ```azurecli-interactive
+ az network bastion rdp --name "<BastionName>" --resource-group "<ResourceGroupName>" --target-resource-id "<VMResourceId>"
+ ```
+
+1. Once you sign in to your target VM, the native client on your computer opens with your VM session. You can now transfer files between your VM and local machine using right-click, then **Copy** and **Paste**.
+
+## Upload files - SSH
+
+This section helps you upload files from your local computer to your target VM over SSH using the *az network bastion tunnel* command. To learn more about the tunnel command, refer to [Connect to a VM using the *az network bastion tunnel* command](connect-native-client-windows.md#connect-tunnel).
+
+> [!NOTE]
+> File download over SSH is not currently supported.
+>
+
+1. Sign in to your Azure account and select your subscription containing your Bastion resource.
+
+ ```azurecli-interactive
+ az login
+ az account list
+ az account set --subscription "<subscription ID>"
+ ```
+
+1. Open the tunnel to your target VM using the following command:
+
+ ```azurecli-interactive
+ az network bastion tunnel --name "<BastionName>" --resource-group "<ResourceGroupName>" --target-resource-id "<VMResourceId>" --resource-port "<TargetVMPort>" --port "<LocalMachinePort>"
+ ```
+
+1. Upload files from your local machine to your target VM using the following command:
+
+ ```azurecli-interactive
+ scp -P <LocalMachinePort> <local machine file path> <username>@127.0.0.1:<target VM file path>
+ ```
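+
+ For example, a hypothetical upload through a tunnel on local port 2222 (the port, username, and file paths are illustrative):
+
+ ```bash
+ # Copy a local file into the signed-in user's home directory on the target VM
+ scp -P 2222 ./app.conf azureuser@127.0.0.1:/home/azureuser/
+ ```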
+
+1. Connect to your target VM using SSH, the native client of your choice, and the local machine port you specified in Step 3. For example, you can use the following command if you have the OpenSSH client installed on your local computer:
+
+ ```azurecli-interactive
+ ssh <username>@127.0.0.1 -p <LocalMachinePort>
+ ```
+
+## Next steps
+
+- Read the [Bastion FAQ](bastion-faq.md)
cognitive-services Bing Autosuggest Upgrade Guide V5 To V7 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/bing-autosuggest-upgrade-guide-v5-to-v7.md
Last updated 02/20/2019
# Autosuggest API upgrade guide
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
This upgrade guide identifies the changes between version 5 and version 7 of the Bing Autosuggest API. Use this guide to help update your application to use version 7.
Blocked|InvalidRequest.Blocked
## Next steps > [!div class="nextstepaction"]
-> [Use and display requirements](../bing-web-search/use-display-requirements.md)
+> [Use and display requirements](../bing-web-search/use-display-requirements.md)
cognitive-services Get Suggestions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/concepts/get-suggestions.md
# Suggesting query terms
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
+ Typically, you'd call the Bing Autosuggest API each time a user types a new character in your application's search box. The completeness of the query string impacts the relevance of the suggested query terms that the API returns. The more complete the query string, the more relevant the list of suggested query terms is. For example, the suggestions that the API may return for `s` are likely to be less relevant than the queries it returns for `sailing dinghies`.
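As a minimal sketch of such a call (the subscription key is a placeholder; verify the endpoint host against your own resource):

```bash
# Request suggestions for the partial query "sailing din"
curl -s "https://api.bing.microsoft.com/v7.0/Suggestions?q=sailing+din&mkt=en-US" \
  -H "Ocp-Apim-Subscription-Key: <YOUR_KEY>"
```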
If the user selects a suggested query from the drop-down list, you'd use the que
## Next steps
-* [What is the Bing Autosuggest API?](../get-suggested-search-terms.md)
+* [What is the Bing Autosuggest API?](../get-suggested-search-terms.md)
cognitive-services Sending Requests https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/concepts/sending-requests.md
Last updated 06/27/2019
# Sending requests to the Bing Autosuggest API.
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
If your application sends queries to any of the Bing Search APIs, you can use the Bing Autosuggest API to improve your users' search experience. The Bing Autosuggest API returns a list of suggested queries based on the partial query string in the search box. As characters are entered into a search box in your application, you can display suggestions in a drop-down list. Use this article to learn more about sending requests to this API.
BingAPIs-Market: en-US
- [What is Bing Autosuggest?](../get-suggested-search-terms.md) - [Bing Autosuggest API v7 reference](/rest/api/cognitiveservices-bingsearch/bing-autosuggest-api-v7-reference)-- [Getting suggested search terms from the Bing Autosuggest API](get-suggestions.md)
+- [Getting suggested search terms from the Bing Autosuggest API](get-suggestions.md)
cognitive-services Get Suggested Search Terms https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/get-suggested-search-terms.md
Last updated 12/18/2019
# What is Bing Autosuggest?
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
If your application sends queries to any of the Bing Search APIs, you can use the Bing Autosuggest API to improve your users' search experience. The Bing Autosuggest API returns a list of suggested queries based on the partial query string in the search box. As characters are entered into the search box, you can display suggestions in a drop-down list.
cognitive-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/language-support.md
Last updated 02/20/2019
# Language and region support for the Bing Autosuggest API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The following lists the languages supported by Bing Autosuggest API.
cognitive-services Client Libraries https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/quickstarts/client-libraries.md
# Quickstart: Use the Bing Autosuggest client library
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
::: zone pivot="programming-language-csharp"
cognitive-services Csharp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/quickstarts/csharp.md
# Quickstart: Suggest search queries with the Bing Autosuggest REST API and C#
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Follow this quickstart to learn how to make calls to the Bing Autosuggest API and read the JSON response. This simple C# application sends a partial search query to the API, and returns suggestions for searches. While this application is written in C#, the API is a RESTful Web service compatible with most programming languages. The source code for this sample is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/dotnet/Search/BingAutosuggestv7.cs).
cognitive-services Java https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/quickstarts/java.md
# Quickstart: Suggest search queries with the Bing Autosuggest REST API and Java
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Follow this quickstart to learn how to make calls to the Bing Autosuggest API and read the JSON response. This simple Java application sends a partial search query to the API, and returns suggestions for searches. While this application is written in Java, the API is a RESTful Web service compatible with most programming languages. The source code for this sample is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/java/Search/BingAutosuggestv7.java)
cognitive-services Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/quickstarts/nodejs.md
# Quickstart: Suggest search queries with the Bing Autosuggest REST API and Node.js
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Follow this quickstart to learn how to make calls to the Bing Autosuggest API and read the JSON response. This simple Node.js application sends a partial search query to the API, and returns suggestions for searches. While this application is written in JavaScript, the API is a RESTful Web service compatible with most programming languages. The source code for this sample is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/nodejs/Search/BingAutosuggestv7.js)
cognitive-services Php https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/quickstarts/php.md
# Quickstart: Suggest search queries with the Bing Autosuggest REST API and PHP
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Follow this quickstart to learn how to make calls to the Bing Autosuggest API and read the JSON response. This simple PHP application sends a partial search query to the API, and returns suggestions for searches. While this application is written in PHP, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/quickstarts/python.md
# Quickstart: Suggest search queries with the Bing Autosuggest REST API and Python
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Follow this quickstart to learn how to make calls to the Bing Autosuggest API and read the JSON response. This simple Python application sends a partial search query to the API, and returns suggestions for searches. While this application is written in Python, the API is a RESTful Web service compatible with most programming languages. The source code for this sample is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/python/Search/BingAutosuggestv7.py)
cognitive-services Ruby https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/quickstarts/ruby.md
# Quickstart: Suggest search queries with the Bing Autosuggest REST API and Ruby
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Follow this quickstart to learn how to make calls to the Bing Autosuggest API and read the JSON response. This simple Ruby application sends a partial search query to the API, and returns suggestions for searches. While this application is written in Ruby, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Autosuggest https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/tutorials/autosuggest.md
# Tutorial: Get search suggestions on a web page
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
In this tutorial, we'll build a Web page that allows users to query the Bing Autosuggest API.
cognitive-services Call Endpoint Csharp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/call-endpoint-csharp.md
# Quickstart: Call your Bing Custom Search endpoint using C#
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to learn how to request search results from your Bing Custom Search instance. Although this application is written in C#, the Bing Custom Search API is a RESTful web service compatible with most programming languages. The source code for this sample is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/dotnet/Search/BingCustomSearchv7.cs).
cognitive-services Call Endpoint Java https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/call-endpoint-java.md
# Quickstart: Call your Bing Custom Search endpoint using Java
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to learn how to request search results from your Bing Custom Search instance. Although this application is written in Java, the Bing Custom Search API is a RESTful web service compatible with most programming languages. The source code for this sample is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/java/Search/BingCustomSearchv7.java).
cognitive-services Call Endpoint Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/call-endpoint-nodejs.md
# Quickstart: Call your Bing Custom Search endpoint using Node.js
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to learn how to request search results from your Bing Custom Search instance. Although this application is written in JavaScript, the Bing Custom Search API is a RESTful web service compatible with most programming languages. The source code for this sample is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/nodejs/Search/BingCustomSearchv7.js).
cognitive-services Call Endpoint Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/call-endpoint-python.md
# Quickstart: Call your Bing Custom Search endpoint using Python
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to learn how to request search results from your Bing Custom Search instance. Although this application is written in Python, the Bing Custom Search API is a RESTful web service compatible with most programming languages. The source code for this sample is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/python/Search/BingCustomSearchv7.py).
cognitive-services Define Custom Suggestions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/define-custom-suggestions.md
# Configure your custom autosuggest experience
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Custom Autosuggest returns a list of suggested search query strings that are relevant to your search experience. The suggested query strings are based on a partial query string that the user provides in the search box. The list will contain a maximum of 10 suggestions.
cognitive-services Define Your Custom View https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/define-your-custom-view.md
# Configure your Bing Custom Search experience
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
A Custom Search instance lets you tailor the search experience to include content only from websites that your users care about. Instead of performing a web-wide search, Bing searches only the slices of the web that interest you. To create your custom view of the web, use the Bing Custom Search [portal](https://www.customsearch.ai).
cognitive-services Endpoint Custom https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/endpoint-custom.md
# Custom Search
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Bing Custom Search enables you to create tailored search experiences for topics that you care about. Your users see search results tailored to the content they care about instead of having to page through search results that have irrelevant content. ## Custom Search Endpoint
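As a minimal sketch of a call to that endpoint (the custom configuration ID and subscription key are placeholders):

```bash
# Query a Bing Custom Search instance; results are scoped to its configured sites
curl -s "https://api.bing.microsoft.com/v7.0/custom/search?q=sailing&customconfig=<CONFIG_ID>&mkt=en-US" \
  -H "Ocp-Apim-Subscription-Key: <YOUR_KEY>"
```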
cognitive-services Get Images From Instance https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/get-images-from-instance.md
Last updated 09/10/2018
# Get images from your custom view
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Bing Custom Images Search lets you enrich your custom search experience with images. Similar to web results, custom search supports searching for images in your instance's list of websites. You can get the images using the Bing Custom Images Search API or through the Hosted UI feature. The Hosted UI feature is simple to use and recommended for getting your search experience up and running in short order. For information about configuring your Hosted UI to include images, see [Configure your hosted UI experience](hosted-ui.md).
cognitive-services Get Videos From Instance https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/get-videos-from-instance.md
Last updated 09/10/2018
# Get videos from your custom view
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Bing Custom Videos Search lets you enrich your custom search experience with videos. Similar to web results, custom search supports searching for videos in your instance's list of websites. You can get the videos using the Bing Custom Videos Search API or through the Hosted UI feature. The Hosted UI feature is simple to use and recommended for getting your search experience up and running in short order. For information about configuring your Hosted UI to include videos, see [Configure your hosted UI experience](hosted-ui.md).
cognitive-services Hosted Ui https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/hosted-ui.md
# Configure your hosted UI experience
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Bing Custom Search provides a hosted UI that you can easily integrate into your webpages and web applications as a JavaScript code snippet. Using the Bing Custom Search portal, you can configure the layout, color, and search options of the UI.
cognitive-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/language-support.md
# Language and region support for the Bing Custom Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Custom Search API supports more than three dozen countries/regions, many with more than one language.
cognitive-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/overview.md
# What is the Bing Custom Search API?
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Custom Search API enables you to create tailored ad-free search experiences for topics that you care about. You can specify the domains and webpages for Bing to search, as well as pin, boost, or demote specific content to create a custom view of the web and help your users quickly find relevant search results.
cognitive-services Quick Start https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/quick-start.md
# Quickstart: Create your first Bing Custom Search instance
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
To use Bing Custom Search, you need to create a custom search instance that defines your view or slice of the web. This instance contains the public domains, websites, and webpages that you want to search, along with any ranking adjustments you may want.
cognitive-services Client Libraries https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/quickstarts/client-libraries.md
# Quickstart: Use the Bing Custom Search client library
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
::: zone pivot="programming-language-csharp"
cognitive-services Search Your Custom View https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/search-your-custom-view.md
# Call your Bing Custom Search instance from the Portal
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
After you've configured your custom search experience, you can test it from within the Bing Custom Search [portal](https://customsearch.ai).
cognitive-services Share Your Custom Search https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/share-your-custom-search.md
# Share your Custom Search instance
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
You can easily allow collaborative editing and testing of your instance by sharing it with members of your team. You can share your instance with anyone using just their email address. To share an instance:
cognitive-services Custom Search Web Page https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/tutorials/custom-search-web-page.md
# Tutorial: Build a Custom Search web page
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Bing Custom Search enables you to create tailored search experiences for topics that you care about. For example, if you own a martial arts website that provides a search experience, you can specify the domains, sub-sites, and webpages that Bing searches. Your users see search results tailored to the content they care about instead of paging through general search results that may contain irrelevant content.
cognitive-services Search For Entities https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/concepts/search-for-entities.md
# Searching for entities with the Bing Entity API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
## Suggest search terms with the Bing Autosuggest API
cognitive-services Sending Requests https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/concepts/sending-requests.md
# Sending search requests to the Bing Entity Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Entity Search API sends a search query to Bing and gets results that include entities and places. Place results include restaurants, hotels, or other local businesses. For places, the query can specify the name of a local business, or it can ask for a list (for example, restaurants near me). Entity results include persons, places, or things. A place in this context is a tourist attraction, a state, a country/region, and so on.
cognitive-services Entity Search Endpoint https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/entity-search-endpoint.md
# Bing Entity Search API endpoint
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Entity Search API has one endpoint that returns entities from the Web based on a query. These search results are returned in JSON.
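A minimal sketch of a call to that endpoint, assuming the Cognitive Services era base URL and a placeholder subscription key:

```python
import requests

SUBSCRIPTION_KEY = "YOUR_SUBSCRIPTION_KEY"  # placeholder
ENDPOINT = "https://api.cognitive.microsoft.com/bing/v7.0/entities"

response = requests.get(
    ENDPOINT,
    headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
    params={"q": "Mount Rainier", "mkt": "en-US"},
)
response.raise_for_status()
data = response.json()

# Entities, when present, arrive under the "entities" answer.
for entity in data.get("entities", {}).get("value", []):
    print(entity["name"], entity.get("description", ""))
```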
cognitive-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/overview.md
Last updated 12/18/2019
# What is Bing Entity Search API?
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Entity Search API sends a search query to Bing and gets results that include entities and places. Place results include restaurants, hotels, or other local businesses. Bing returns places if the query specifies the name of the local business or asks for a type of business (for example, restaurants near me). Bing returns entities if the query specifies well-known people, places (tourist attractions, states, countries/regions, etc.), or things.
cognitive-services Client Libraries https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/quickstarts/client-libraries.md
# Quickstart: Use the Bing Entity Search client library
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
::: zone pivot="programming-language-csharp"
cognitive-services Csharp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/quickstarts/csharp.md
# Quickstart: Send a search request to the Bing Entity Search REST API using C#
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Entity Search API and view the JSON response. This simple C# application sends a news search query to the API, and displays the response. The source code for this application is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/dotnet/Search/BingEntitySearchv7.cs).
cognitive-services Java https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/quickstarts/java.md
# Quickstart: Send a search request to the Bing Entity Search REST API using Java
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Entity Search API and view the JSON response. This simple Java application sends a news search query to the API, and displays the response.
cognitive-services Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/quickstarts/nodejs.md
# Quickstart: Send a search request to the Bing Entity Search REST API using Node.js
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Entity Search API and view the JSON response. This simple JavaScript application sends a news search query to the API, and displays the response. The source code for this sample is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/nodejs/Search/BingEntitySearchv7.js).
cognitive-services Php https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/quickstarts/php.md
# Quickstart: Send a search request to the Bing Entity Search REST API using PHP
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Entity Search API and view the JSON response. This simple PHP application sends a news search query to the API, and displays the response.
cognitive-services Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/quickstarts/python.md
# Quickstart: Send a search request to the Bing Entity Search REST API using Python
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Entity Search API and view the JSON response. This simple Python application sends a news search query to the API, and displays the response. The source code for this sample is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/python/Search/BingEntitySearchv7.py).
cognitive-services Ruby https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/quickstarts/ruby.md
# Quickstart: Send a search request to the Bing Entity Search REST API using Ruby
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Entity Search API and view the JSON response. This simple Ruby application sends a news search query to the API, and displays the response. The source code for this application is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/ruby/Search/BingEntitySearchv7.rb).
cognitive-services Rank Results https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/rank-results.md
# Using ranking to display entity search results
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Each entity search response includes a [RankingResponse](/rest/api/cognitiveservices/bing-web-api-v7-reference#rankingresponse) answer that specifies how you must display search results returned by the Bing Entity Search API. The ranking response groups results into pole, mainline, and sidebar content. The pole result is the most important or prominent result and should be displayed first. If you do not display the remaining results in a traditional mainline and sidebar format, you must give the mainline content higher visibility than the sidebar content.
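A sketch of flattening a ranking response into display order; it assumes each answer's top-level key differs from its `answerType` only in the case of the leading letter (for example, `Entities` maps to `entities`), which holds for the entity search answers:

```python
def ordered_results(response_json):
    """Flatten a Bing ranking response into display order:
    pole content first, then mainline, then sidebar."""
    ranking = response_json.get("rankingResponse", {})
    ordered = []
    for group in ("pole", "mainline", "sidebar"):
        for item in ranking.get(group, {}).get("items", []):
            answer_type = item["answerType"]
            key = answer_type[0].lower() + answer_type[1:]
            values = response_json.get(key, {}).get("value", [])
            index = item.get("resultIndex")
            if index is None:
                ordered.extend(values)          # show the whole answer
            elif 0 <= index < len(values):
                ordered.append(values[index])   # show one result
    return ordered
```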
cognitive-services Tutorial Bing Entities Search Single Page App https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/tutorial-bing-entities-search-single-page-app.md
# Tutorial: Single-page web app
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Entity Search API lets you search the Web for information about *entities* and *places.* You may request either kind of result, or both, in a given query. The definitions of places and entities are provided below.
cognitive-services Bing News Upgrade Guide V5 To V7 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/bing-news-upgrade-guide-v5-to-v7.md
Last updated 01/10/2019
# News Search API upgrade guide
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
This upgrade guide identifies the changes between version 5 and version 7 of the Bing News Search API. Use this guide to help you identify the parts of your application that you need to update to use version 7.
cognitive-services Search For News https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/concepts/search-for-news.md
Last updated 12/18/2019
# Search for news with the Bing News Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing News Search API makes it easy to integrate Bing's cognitive news searching capabilities into your applications.
cognitive-services Send Search Queries https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/concepts/send-search-queries.md
# Sending queries to the Bing News Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing News Search API enables you to search the web for relevant news items. Use this article to learn more about sending search queries to the API.
cognitive-services Csharp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/csharp.md
# Quickstart: Search for news using C# and the Bing News Search REST API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing News Search API. This simple C# application sends a news search query to the API, and displays the JSON response.
cognitive-services Endpoint News https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/endpoint-news.md
# Bing News Search API endpoints
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The **News Search API** returns news articles, Web pages, images, videos, and [entities](../bing-entities-search/overview.md). Entities contain summary information about a person, place, or topic.
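A minimal sketch of a query-based news search, assuming the Cognitive Services era base URL and a placeholder key; the `/news` (top news by category) and `/news/trendingtopics` endpoints take the same header:

```python
import requests

SUBSCRIPTION_KEY = "YOUR_SUBSCRIPTION_KEY"  # placeholder
BASE = "https://api.cognitive.microsoft.com/bing/v7.0"

response = requests.get(
    f"{BASE}/news/search",
    headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
    params={"q": "quantum computing", "mkt": "en-US", "count": 5},
)
response.raise_for_status()

# Each news article includes a name, URL, and provider details.
for article in response.json().get("value", []):
    print(article["name"], "-", article["url"])
```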
cognitive-services Go https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/go.md
# Quickstart: Get news results using the Bing News Search REST API and Go
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
This quickstart uses the Go language to call the Bing News Search API. The results include names and URLs of news sources identified by the query string.
cognitive-services Java https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/java.md
# Quickstart: Perform a news search using Java and the Bing News Search REST API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing News Search API. This simple Java application sends a news search query to the API, and displays the JSON response.
cognitive-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/language-support.md
# Language and region support for the Bing News Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing News Search API supports numerous countries/regions, many with more than one language. Specifying a country/region with a query serves primarily to refine search results based on interests in that country/region. Additionally, the results may contain links to Bing, and these links may localize the Bing user experience according to the specified country/region or language.
cognitive-services Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/nodejs.md
# Quickstart: Perform a news search using Node.js and the Bing News Search REST API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing News Search API. This simple JavaScript application sends a search query to the API and displays the JSON response.
cognitive-services Php https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/php.md
# Quickstart: Perform a news search using PHP and the Bing News Search REST API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing News Search API. This simple PHP application sends a search query to the API and displays the JSON response.
cognitive-services Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/python.md
# Quickstart: Perform a news search using Python and the Bing News Search REST API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing News Search API. This simple Python application sends a search query to the API and processes the JSON result.
cognitive-services Client Libraries https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/quickstarts/client-libraries.md
# Quickstart: Use the Bing News Search client library
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
::: zone pivot="programming-language-csharp"
cognitive-services Ruby https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/ruby.md
# Quickstart: Perform a news search using Ruby and the Bing News Search REST API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing News Search API. This simple Ruby application sends a search query to the API and processes the JSON response.
cognitive-services Search The Web https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/search-the-web.md
# What is the Bing News Search API?
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing News Search API makes it easy to integrate Bing's cognitive news searching capabilities into your applications. The API provides a similar experience to [Bing News](https://www.bing.com/news), letting you send search queries and receive relevant news articles.
cognitive-services Tutorial Bing News Search Single Page App https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/tutorial-bing-news-search-single-page-app.md
# Tutorial: Create a single-page web app
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing News Search API lets you search the Web and get news results relevant to a search query. In this tutorial, we build a single-page Web application that uses the Bing News Search API to display search results on the page. The application includes HTML, CSS, and JavaScript components. The source code for this sample is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/Tutorials/BingNewsSearchApp.html).
cognitive-services Bing Spell Check Upgrade Guide V5 To V7 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/bing-spell-check-upgrade-guide-v5-to-v7.md
Last updated 02/20/2019
# Spell Check API upgrade guide
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
This upgrade guide identifies the changes between version 5 and version 7 of the Bing Spell Check API. Use this guide to help you identify the parts of your application that you need to update to use version 7.
cognitive-services Sending Requests https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/concepts/sending-requests.md
# Sending requests to the Bing Spell Check API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
To check a text string for spelling and grammar errors, you send a GET request to the Bing Spell Check endpoint:
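A minimal sketch of such a request, assuming the Cognitive Services era endpoint and a placeholder key; `mode` can be `spell` or `proof`:

```python
import requests

SUBSCRIPTION_KEY = "YOUR_SUBSCRIPTION_KEY"  # placeholder
ENDPOINT = "https://api.cognitive.microsoft.com/bing/v7.0/spellcheck"

response = requests.get(
    ENDPOINT,
    headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
    params={"text": "Bill Gatas", "mode": "spell", "mkt": "en-US"},
)
response.raise_for_status()

# Each flagged token includes its offset, the token itself, and
# ranked correction suggestions.
for token in response.json().get("flaggedTokens", []):
    best = token["suggestions"][0]["suggestion"] if token["suggestions"] else "?"
    print(f'{token["token"]} -> {best}')
```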
cognitive-services Using Spell Check https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/concepts/using-spell-check.md
# Using the Bing Spell Check API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this article to learn about using the Bing Spell Check API to perform contextual grammar and spell checking. While most spell-checkers rely on dictionary-based rule sets, the Bing spell-checker leverages machine learning and statistical machine translation to provide accurate and contextual corrections.
cognitive-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/language-support.md
# Language and region support for Bing Spell Check API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The following languages are supported by the Bing Spell Check API (in `spell` mode only).
cognitive-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/overview.md
# What is the Bing Spell Check API?
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Spell Check API enables you to perform contextual grammar and spell checking on text. While most spell-checkers rely on dictionary-based rule sets, the Bing spell-checker leverages machine learning and statistical machine translation to provide accurate and contextual corrections.
cognitive-services Csharp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/quickstarts/csharp.md
# Quickstart: Check spelling with the Bing Spell Check REST API and C#
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Spell Check REST API. This simple C# application sends a request to the API and returns a list of suggested corrections.
cognitive-services Java https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/quickstarts/java.md
# Quickstart: Check spelling with the Bing Spell Check REST API and Java
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Spell Check REST API. This simple Java application sends a request to the API and returns a list of suggested corrections.
cognitive-services Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/quickstarts/nodejs.md
# Quickstart: Check spelling with the Bing Spell Check REST API and Node.js
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Spell Check REST API. This simple JavaScript application sends a request to the API and returns a list of suggested corrections.
cognitive-services Php https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/quickstarts/php.md
# Quickstart: Check spelling with the Bing Spell Check REST API and PHP
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Spell Check REST API. This simple PHP application sends a request to the API and returns a list of suggested corrections.
cognitive-services Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/quickstarts/python.md
# Quickstart: Check spelling with the Bing Spell Check REST API and Python
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Spell Check REST API. This simple Python application sends a request to the API and returns a list of suggested corrections.
cognitive-services Ruby https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/quickstarts/ruby.md
# Quickstart: Check spelling with the Bing Spell Check REST API and Ruby
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Spell Check REST API using Ruby. This simple application sends a request to the API and returns a list of suggested corrections.
cognitive-services Sdk Quickstart Spell Check https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/sdk-quickstart-spell-check.md
# Quickstart: Check spelling with the Bing Spell Check SDK for C#
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to begin spell checking with the Bing Spell Check SDK for C#. While Bing Spell Check has a REST API compatible with most programming languages, the SDK provides an easy way to integrate the service into your applications. The source code for this sample can be found on [GitHub](https://github.com/Azure-Samples/cognitive-services-dotnet-sdk-samples/tree/master/samples/SpellCheck).
cognitive-services Spellcheck https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/tutorials/spellcheck.md
# Tutorial: Build a Web page Spell Check client
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
In this tutorial, we'll build a Web page that allows users to query the Bing Spell Check API. The source code for this application is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/Tutorials/BingSpellCheckApp.html).
cognitive-services Bing Video Upgrade Guide V5 To V7 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/bing-video-upgrade-guide-v5-to-v7.md
Last updated 01/31/2019
# Video Search API upgrade guide
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
This upgrade guide identifies the changes between version 5 and version 7 of the Bing Video Search API. Use this guide to help you identify the parts of your application that you need to update to use version 7.
cognitive-services Get Videos https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/concepts/get-videos.md
# Search for videos with the Bing Video Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Video Search API makes it easy to integrate Bing's cognitive video searching capabilities into your applications. The API primarily finds and returns relevant videos from the web, and it provides several features for intelligent and focused video retrieval.
cognitive-services Sending Requests https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/concepts/sending-requests.md
# Sending search requests to the Bing Video Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
This article describes the parameters and attributes of requests sent to the Bing Video Search API, as well as the JSON response object it returns.
cognitive-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/overview.md
Last updated 12/18/2019
# What is the Bing Video Search API?
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Video Search API makes it easy to add video searching capabilities to your services and applications. By sending user search queries with the API, you can get and display relevant and high-quality videos similar to [Bing Video](https://www.bing.com/video). Use this API for search results that only contain videos. The [Bing Web Search API](../bing-web-search/overview.md) can return other types of web content, including webpages, videos, news, and images.
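A minimal sketch of a video search request, assuming the Cognitive Services era endpoint and a placeholder key:

```python
import requests

SUBSCRIPTION_KEY = "YOUR_SUBSCRIPTION_KEY"  # placeholder
ENDPOINT = "https://api.cognitive.microsoft.com/bing/v7.0/videos/search"

response = requests.get(
    ENDPOINT,
    headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
    params={"q": "sailing lessons", "mkt": "en-US", "count": 5},
)
response.raise_for_status()

# Each video result includes a name and a content URL.
for video in response.json().get("value", []):
    print(video["name"], "-", video["contentUrl"])
```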
cognitive-services Client Libraries https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/quickstarts/client-libraries.md
# Quickstart: Use the Bing Video Search client library
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
::: zone pivot="programming-language-csharp"
cognitive-services Csharp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/quickstarts/csharp.md
# Quickstart: Search for videos using the Bing Video Search REST API and C#
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Video Search API. This simple C# application sends an HTTP video search query to the API and displays the JSON response. Although this application is written in C#, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Java https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/quickstarts/java.md
# Quickstart: Search for videos using the Bing Video Search REST API and Java
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Video Search API. This simple Java application sends an HTTP video search query to the API and displays the JSON response. Although this application is written in Java, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/quickstarts/nodejs.md
# Quickstart: Search for videos using the Bing Video Search REST API and Node.js
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Video Search API. This simple JavaScript application sends an HTTP video search query to the API, and displays the JSON response. Although this application is written in JavaScript and uses Node.js, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Php https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/quickstarts/php.md
# Quickstart: Search for videos using the Bing Video Search REST API and PHP
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Video Search API. This simple PHP application sends an HTTP video search query to the API, and displays the JSON response. The example code is written to work under PHP 5.6.
cognitive-services Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/quickstarts/python.md
# Quickstart: Search for videos using the Bing Video Search REST API and Python
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Video Search API. This simple Python application sends an HTTP video search query to the API, and displays the JSON response. Although this application is written in Python, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Ruby https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/quickstarts/ruby.md
# Quickstart: Search for videos using the Bing Video Search REST API and Ruby
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Video Search API. This simple Ruby application sends an HTTP video search query to the API, and displays the JSON response. Although this application is written in Ruby, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Trending Videos https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/trending-videos.md
Last updated 01/31/2019
# Get trending videos with the Bing Video Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Video Search API enables you to find today's trending videos from across the web and in different categories.
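A sketch of a trending-videos request, assuming the Cognitive Services era endpoint, a placeholder key, and the category/tile response shape described in the API reference:

```python
import requests

SUBSCRIPTION_KEY = "YOUR_SUBSCRIPTION_KEY"  # placeholder
ENDPOINT = "https://api.cognitive.microsoft.com/bing/v7.0/videos/trending"

response = requests.get(
    ENDPOINT,
    headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
    params={"mkt": "en-US"},
)
response.raise_for_status()

# Trending results are grouped into categories of tiles; each tile
# carries a query you can feed back into /videos/search.
for category in response.json().get("categories", []):
    print(category["title"])
    for sub in category.get("subcategories", []):
        for tile in sub.get("tiles", [])[:3]:
            print("  ", tile["query"]["text"])
```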
cognitive-services Tutorial Bing Video Search Single Page App https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/tutorial-bing-video-search-single-page-app.md
# Tutorial: Single-page Video Search app
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Video Search API lets you search the Web and get video results relevant to a search query. In this tutorial, we build a single-page Web application that uses the Bing search API to display search results on the page. The application includes HTML, CSS, and JavaScript components.
cognitive-services Video Insights https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/video-insights.md
Last updated 01/31/2019
# Get insights about a video
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Each video returned by the Bing Video Search API includes a video ID that you can use to get more information about it, such as related videos. To get insights about a video, get its [videoId](/rest/api/cognitiveservices-bingsearch/bing-video-api-v7-reference#video-videoid) token in the API response.
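A sketch of fetching insights with that token, assuming the Cognitive Services era `/videos/details` endpoint, a placeholder key, and a hypothetical `VIDEO_ID` taken from a prior search response:

```python
import requests

SUBSCRIPTION_KEY = "YOUR_SUBSCRIPTION_KEY"  # placeholder
ENDPOINT = "https://api.cognitive.microsoft.com/bing/v7.0/videos/details"

VIDEO_ID = "A_VIDEO_ID_FROM_A_PRIOR_SEARCH"  # hypothetical placeholder

response = requests.get(
    ENDPOINT,
    headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
    params={"id": VIDEO_ID, "modules": "RelatedVideos", "mkt": "en-US"},
)
response.raise_for_status()

# Related videos, when requested, arrive under "relatedVideos".
for video in response.json().get("relatedVideos", {}).get("value", []):
    print(video["name"])
```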
cognitive-services Autosuggest Bing Search Terms https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/autosuggest-bing-search-terms.md
# Autosuggest Bing search terms in your application
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
If you provide a search box where the user enters their search term, use the [Bing Autosuggest API](../bing-autosuggest/get-suggested-search-terms.md) to improve the experience. The API returns suggested query strings based on partial search terms as the user types.
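A minimal sketch of fetching suggestions for a partial term, assuming the Cognitive Services era Autosuggest endpoint and a placeholder key:

```python
import requests

SUBSCRIPTION_KEY = "YOUR_SUBSCRIPTION_KEY"  # placeholder
ENDPOINT = "https://api.cognitive.microsoft.com/bing/v7.0/suggestions"

response = requests.get(
    ENDPOINT,
    headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
    params={"q": "sail", "mkt": "en-US"},
)
response.raise_for_status()

# Suggestions arrive in suggestionGroups -> searchSuggestions.
for group in response.json().get("suggestionGroups", []):
    for suggestion in group.get("searchSuggestions", []):
        print(suggestion["displayText"])
```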
cognitive-services Bing Api Comparison https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/bing-api-comparison.md
# What are the Bing Search APIs?
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Search APIs let you build web-connected apps and services that find webpages, images, news, locations, and more without advertisements. By sending search requests using the Bing Search REST APIs or SDKs, you can get relevant information and content for web searches. Use this article to learn about the different Bing search APIs and how you can integrate cognitive searches into your applications and services. Pricing and rate limits may vary between APIs.
cognitive-services Bing Web Stats https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/bing-web-stats.md
# Add analytics to the Bing Search APIs
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Bing Statistics provides analytics for the Bing Search APIs. These analytics include call volume, top query strings, geographic distribution, and more. You can enable Bing Statistics in the [Azure portal](https://ms.portal.azure.com) by navigating to your Azure resource and clicking **Enable Bing Statistics**.
cognitive-services Bing Web Upgrade Guide V5 To V7 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/bing-web-upgrade-guide-v5-to-v7.md
Last updated 02/12/2019
# Upgrade from Bing Web Search API v5 to v7
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
This upgrade guide identifies the changes between version 5 and version 7 of the Bing Web Search API. Use this guide to help you identify the parts of your application that you need to update to use version 7.
cognitive-services Csharp Ranking Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/csharp-ranking-tutorial.md
# Build a console app search client in C#
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
This tutorial shows how to build a simple .NET Core console app that allows users to query the Bing Web Search API and display ranked results.
cognitive-services Filter Answers https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/filter-answers.md
Last updated 07/08/2019
# Filtering the answers that the search response includes
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
When you query the web, Bing returns all the relevant content it finds for the search. For example, if the search query is "sailing+dinghies", the response might contain several answer types, such as webpages, images, and videos.
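To keep only the answers you want, the Web Search v7 reference documents a `responseFilter` query parameter that takes a comma-delimited list of answer types. A hedged Python sketch (endpoint and key are placeholders):

```python
import requests

headers = {"Ocp-Apim-Subscription-Key": "YOUR_SUBSCRIPTION_KEY"}  # placeholder
params = {"q": "sailing dinghies", "responseFilter": "Webpages,Videos"}
body = requests.get("https://api.cognitive.microsoft.com/bing/v7.0/search",
                    headers=headers, params=params).json()
# The response should now carry only the webPages and videos answers.
```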
cognitive-services Hit Highlighting https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/hit-highlighting.md
Last updated 07/30/2019
# Using decoration markers to highlight text
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Bing supports hit highlighting, which marks query terms (or other terms that Bing finds relevant) in the display strings of some answers. For example, a webpage result's `name`, `displayUrl`, and `snippet` fields might contain marked query terms.
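A hedged Python sketch of requesting and rendering those markers; the `textDecorations` and `textFormat` parameters and the U+E000/U+E001 marker characters follow the hit-highlighting documentation, while the endpoint and key are placeholders.

```python
import requests

headers = {"Ocp-Apim-Subscription-Key": "YOUR_SUBSCRIPTION_KEY"}  # placeholder
params = {"q": "sailing dinghies", "textDecorations": "true", "textFormat": "Raw"}
body = requests.get("https://api.cognitive.microsoft.com/bing/v7.0/search",
                    headers=headers, params=params).json()

def to_html_bold(marked: str) -> str:
    """Convert the raw U+E000/U+E001 hit-highlighting markers to <b> tags."""
    return marked.replace("\ue000", "<b>").replace("\ue001", "</b>")
```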
cognitive-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/language-support.md
# Language and region support for the Bing Web Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Web Search API supports over three dozen countries or regions, many with more than one language. Specifying a country or region with a query helps refine search results based on that country's or region's interests. The results may include links to Bing, and these links may localize the Bing user experience according to the specified country/region or language.
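As a hedged sketch, a market is specified with the `mkt` query parameter (parameter name from the Web Search v7 reference; the endpoint and key are placeholders):

```python
import requests

headers = {"Ocp-Apim-Subscription-Key": "YOUR_SUBSCRIPTION_KEY"}  # placeholder
params = {"q": "voiles", "mkt": "fr-FR"}  # market code is <language>-<country/region>
body = requests.get("https://api.cognitive.microsoft.com/bing/v7.0/search",
                    headers=headers, params=params).json()
```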
cognitive-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/overview.md
# What is the Bing Web Search API?
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Web Search API is a RESTful service that provides instant answers to user queries. Search results are easily configured to include web pages, images, videos, news, translations, and more. Bing Web Search provides the results as JSON based on search relevance and your Bing Web Search subscriptions.
cognitive-services Paging Search Results https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/paging-search-results.md
# How to page through results from the Bing Search APIs
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
When you send a call to the Bing Web, Custom, Image, News, or Video Search APIs, Bing returns a subset of the total number of results that may be relevant to the query. To get the estimated total number of available results, access the answer object's `totalEstimatedMatches` field.
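For instance, a hedged Python sketch of fetching one page with the `count` and `offset` query parameters; the parameter and field names follow the Web Search v7 reference, and the endpoint and key are placeholders.

```python
import requests

SUBSCRIPTION_KEY = "YOUR_SUBSCRIPTION_KEY"  # placeholder
ENDPOINT = "https://api.cognitive.microsoft.com/bing/v7.0/search"

def page_results(query: str, page: int, page_size: int = 10) -> dict:
    """Fetch one page of web results using the count and offset parameters."""
    headers = {"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY}
    params = {"q": query, "count": page_size, "offset": page * page_size}
    body = requests.get(ENDPOINT, headers=headers, params=params).json()
    total = body.get("webPages", {}).get("totalEstimatedMatches", 0)
    print(f"~{total} results available")
    return body
```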
cognitive-services Client Libraries https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/quickstarts/client-libraries.md
# Quickstart: Use a Bing Web Search client library
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
::: zone pivot="programming-language-csharp"
cognitive-services Csharp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/quickstarts/csharp.md
# Quickstart: Search the web using the Bing Web Search REST API and C#
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Web Search API. This C# application sends a search request to the API, and shows the JSON response. Although this application is written in C#, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Go https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/quickstarts/go.md
# Quickstart: Search the web using the Bing Web Search REST API and Go
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Web Search API. This Go application sends a search request to the API, and shows the JSON response. Although this application is written in Go, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Java https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/quickstarts/java.md
# Quickstart: Use Java to search the web with the Bing Web Search REST API, an Azure cognitive service
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
In this quickstart, you'll use a Java application to make your first call to the Bing Web Search API. This Java application sends a search request to the API, and shows the JSON response. Although this application is written in Java, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/quickstarts/nodejs.md
# Quickstart: Search the web using the Bing Web Search REST API and Node.js
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Web Search API. This Node.js application sends a search request to the API, and shows the JSON response. Although this application is written in JavaScript, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Php https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/quickstarts/php.md
# Quickstart: Use PHP to call the Bing Web Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Web Search API. This PHP application sends a search request to the API, and shows the JSON response. Although this application is written in PHP, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/quickstarts/python.md
# Quickstart: Use Python to call the Bing Web Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Web Search API. This Python application sends a search request to the API, and shows the JSON response. Although this application is written in Python, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Ruby https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/quickstarts/ruby.md
# Quickstart: Use Ruby to call the Bing Web Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Web Search API. This Ruby application sends a search request to the API, and shows the JSON response. Although this application is written in Ruby, the API is a RESTful Web service compatible with most programming languages.
Use this code to make a request and handle the response:
```ruby
# Construct the endpoint uri.
uri = URI(uri + path + "?q=" + URI.escape(term))
puts "Searching the Web for: " + term

# Create the request.
request = Net::HTTP::Get.new(uri)
request['Ocp-Apim-Subscription-Key'] = accessKey

# Get the response.
response = Net::HTTP.start(uri.host, uri.port, :use_ssl => uri.scheme == 'https') do |http|
  http.request(request)
end
```
cognitive-services Rank Results https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/rank-results.md
Last updated 03/17/2019
# How to use ranking to display Bing Web Search API results
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Each search response includes a [RankingResponse](/rest/api/cognitiveservices-bingsearch/bing-web-api-v7-reference#rankingresponse) answer that specifies how to display the search results. The ranking response groups results into mainline content and sidebar content for a traditional search results page. If you do not display the results in a traditional mainline-and-sidebar format, you must give the mainline content higher visibility than the sidebar content.
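As a hedged illustration of honoring that order, this Python sketch walks the mainline items and resolves each one to its result; the `answerType` and `resultIndex` field names come from the RankingResponse reference, and `body` is assumed to be a parsed Web Search response.

```python
def mainline_results(body: dict) -> list:
    """Order results the way the RankingResponse dictates for mainline display."""
    ordered = []
    for item in body["rankingResponse"]["mainline"]["items"]:
        answer_type = item["answerType"]              # e.g. "WebPages", "News"
        field = answer_type[0].lower() + answer_type[1:]
        answer = body.get(field)
        if answer is None:
            continue
        index = item.get("resultIndex")
        # Without a resultIndex, display the whole answer; otherwise one result.
        ordered.append(answer if index is None else answer["value"][index])
    return ordered
```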
cognitive-services Resize And Crop Thumbnails https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/resize-and-crop-thumbnails.md
# Resize and crop thumbnail images
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Some answers from the Bing Search APIs include URLs to thumbnail images served by Bing. You can resize and crop these thumbnails by editing the query parameters in their URLs.
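As a hedged Python sketch of that idea, the function below rewrites a thumbnail URL's `w` (width) and `h` (height) query parameters; the smart-crop value is an assumption to verify against this article's parameter table.

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def resize_thumbnail(url: str, width: int, height: int, smart_crop: bool = False) -> str:
    """Rewrite a Bing thumbnail URL's query string to request a resized image."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query.update({"w": str(width), "h": str(height)})
    if smart_crop:
        query["c"] = "7"  # assumed smart-crop code; check the parameter table
    return urlunparse(parts._replace(query=urlencode(query)))
```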
cognitive-services Sdk Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/sdk-samples.md
# Bing Web Search SDK samples
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Web Search SDK is available in Python, Node.js, C#, and Java. Code samples, prerequisites, and build instructions are provided on GitHub. The following scenarios are covered:
cognitive-services Search Responses https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/search-responses.md
# Bing Web Search API response structure and answer types
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
When you send Bing Web Search a search request, it returns a [`SearchResponse`](/rest/api/cognitiveservices-bingsearch/bing-web-api-v7-reference#searchresponse) object in the response body. The object includes a field for each answer that Bing determined was relevant to the query. For example, if Bing returned every answer type, the response object would contain fields such as `webPages`, `images`, `videos`, `news`, and `relatedSearches`.
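As a hedged sketch, pulling the webpage results out of a parsed response might look like this; the `webPages.value`, `name`, and `url` field names come from the SearchResponse reference.

```python
def print_web_answers(body: dict) -> None:
    """Print the name and URL of each webpage result, if that answer is present."""
    for page in body.get("webPages", {}).get("value", []):
        print(f"{page['name']} - {page['url']}")
```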
cognitive-services Throttling Requests https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/throttling-requests.md
# Throttling requests to the Bing Web Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
[!INCLUDE [cognitive-services-bing-throttling-requests](../../../includes/cognitive-services-bing-throttling-requests.md)]
cognitive-services Tutorial Bing Web Search Single Page App https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/tutorial-bing-web-search-single-page-app.md
# Tutorial: Create a single-page app using the Bing Web Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
This single-page app demonstrates how to retrieve, parse, and display search results from the Bing Web Search API. The tutorial uses boilerplate HTML and CSS, and focuses on the JavaScript code. HTML, CSS, and JS files are available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/tree/master/Tutorials/Bing-Web-Search) with quickstart instructions.
cognitive-services Use Display Requirements https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/use-display-requirements.md
# Bing Search API use and display requirements
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
These use and display requirements apply to any implementation of the content and associated information from the following Bing Search APIs, including relationships, metadata, and other signals.
cognitive-services Web Search Endpoints https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/web-search-endpoints.md
# Web Search endpoint
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The **Web Search API** returns Web pages, news, images, videos, and [entities](../bing-entities-search/overview.md). Entities have summary information about a person, place, or topic.
cognitive-services Luis Reference Regions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/LUIS/luis-reference-regions.md
An app created in an authoring region can only be published to a corresponding publishing region:
| Asia | `westus`<br>[www.luis.ai][www.luis.ai]| East Asia<br>`eastasia` | `https://eastasia.api.cognitive.microsoft.com/luis/v2.0/apps/YOUR-APP-ID?subscription-key=YOUR-SUBSCRIPTION-KEY` |
| Asia | `westus`<br>[www.luis.ai][www.luis.ai]| Japan East<br>`japaneast` | `https://japaneast.api.cognitive.microsoft.com/luis/v2.0/apps/YOUR-APP-ID?subscription-key=YOUR-SUBSCRIPTION-KEY` |
| Asia | `westus`<br>[www.luis.ai][www.luis.ai]| Japan West<br>`japanwest` | `https://japanwest.api.cognitive.microsoft.com/luis/v2.0/apps/YOUR-APP-ID?subscription-key=YOUR-SUBSCRIPTION-KEY` |
+| Asia | `westus`<br>[www.luis.ai][www.luis.ai]| Jio India West<br>`jioindiawest` | `https://jioindiawest.api.cognitive.microsoft.com/luis/v2.0/apps/YOUR-APP-ID?subscription-key=YOUR-SUBSCRIPTION-KEY` |
| Asia | `westus`<br>[www.luis.ai][www.luis.ai]| Korea Central<br>`koreacentral` | `https://koreacentral.api.cognitive.microsoft.com/luis/v2.0/apps/YOUR-APP-ID?subscription-key=YOUR-SUBSCRIPTION-KEY` |
| Asia | `westus`<br>[www.luis.ai][www.luis.ai]| Southeast Asia<br>`southeastasia` | `https://southeastasia.api.cognitive.microsoft.com/luis/v2.0/apps/YOUR-APP-ID?subscription-key=YOUR-SUBSCRIPTION-KEY` |
| Asia | `westus`<br>[www.luis.ai][www.luis.ai]| North UAE<br>`northuae` | `https://northuae.api.cognitive.microsoft.com/luis/v2.0/apps/YOUR-APP-ID?subscription-key=YOUR-SUBSCRIPTION-KEY` |
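As a hedged illustration, querying one of these v2.0 prediction endpoints is a single GET with the utterance in the `q` parameter; the app ID and key below are placeholders.

```python
import requests

APP_ID = "YOUR-APP-ID"            # placeholder
KEY = "YOUR-SUBSCRIPTION-KEY"     # placeholder
ENDPOINT = f"https://eastasia.api.cognitive.microsoft.com/luis/v2.0/apps/{APP_ID}"

prediction = requests.get(
    ENDPOINT, params={"subscription-key": KEY, "q": "turn on the lights"}).json()
print(prediction.get("topScoringIntent"))
```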
cognitive-services Add Sharepoint Datasources https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/QnAMaker/How-To/add-sharepoint-datasources.md
If the QnA Maker knowledge base manager is not the Active Directory manager, you
## Add supported file types to knowledge base
-You can add all QnA Maker-supported [file types](https://docs.microsoft.com/azure/cognitive-services/qnamaker/concepts/data-sources-and-content#file-and-url-data-types) from a SharePoint site to your knowledge base. You may have to grant [permissions](#permissions) if the file resource is secured.
+You can add all QnA Maker-supported [file types](/azure/cognitive-services/qnamaker/concepts/data-sources-and-content#file-and-url-data-types) from a SharePoint site to your knowledge base. You may have to grant [permissions](#permissions) if the file resource is secured.
1. From the library with the SharePoint site, select the file's ellipsis menu, `...`.
1. Copy the file's URL.
Use the **@microsoft.graph.downloadUrl** from the previous section as the `fileu
## Next steps
> [!div class="nextstepaction"]
-> [Collaborate on your knowledge base](https://docs.microsoft.com/azure/cognitive-services/qnamaker/concepts/data-sources-and-content#file-and-url-data-types.yml)
+> [Collaborate on your knowledge base](/azure/cognitive-services/qnamaker/concepts/data-sources-and-content#file-and-url-data-types.yml)
cognitive-services Audio Processing Speech Sdk https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/audio-processing-speech-sdk.md
Title: Using the Microsoft Audio Stack (MAS) - Speech service
+ Title: Use the Microsoft Audio Stack (MAS) - Speech service
description: An overview of the features, capabilities, and restrictions for audio processing using the Speech Software Development Kit (SDK).
Previously updated : 12/27/2021 Last updated : 01/31/2022 ms.devlang: cpp, csharp, java
-# Using the Microsoft Audio Stack (MAS)
+# Use the Microsoft Audio Stack (MAS)
The Speech SDK integrates Microsoft Audio Stack (MAS), allowing any application or product to use its audio processing capabilities on input audio. See the [Audio processing](audio-processing-overview.md) documentation for an overview.
-In this article, you learn how to use the Speech SDK to leverage the Microsoft Audio Stack (MAS).
+In this article, you learn how to use the Microsoft Audio Stack (MAS) with the Speech SDK.
## Default options
SpeechRecognizer recognizer = new SpeechRecognizer(speechConfig, audioInput);
## Preset microphone geometry

This sample shows how to use MAS with a predefined microphone geometry on a specified audio input device. In this example:
-* **Enhancement options** - The default enhancements will be applied on the input audio stream.
+* **Enhancement options** - The default enhancements are applied on the input audio stream.
* **Preset geometry** - The preset geometry represents a linear 2-microphone array.
* **Audio input device** - The audio input device ID is `hw:0,1`. For more information on how to select an audio input device, see [How to: Select an audio input device with the Speech SDK](how-to-select-audio-input-devices.md).
SpeechRecognizer recognizer = new SpeechRecognizer(speechConfig, audioInput);
This sample shows how to use MAS with a custom microphone geometry on a specified audio input device. In this example:
* **Enhancement options** - The default enhancements are applied on the input audio stream.
-* **Custom geometry** - A custom microphone geometry for a 7-microphone array is provided by specifying the microphone coordinates. The units for coordinates are millimeters.
-* **Audio input** - The audio input is from a file, where the audio within the file is expected to be captured from an audio input device corresponding to the custom geometry specified.
+* **Custom geometry** - A custom microphone geometry for a 7-microphone array is provided via the microphone coordinates. The units for coordinates are millimeters.
+* **Audio input** - The audio input is from a file, where the audio is expected to come from an audio input device that matches the specified custom geometry.
### [C#](#tab/csharp)
This sample shows how to use MAS with a custom microphone geometry and beamformi
* **Enhancement options** - The default enhancements are applied on the input audio stream.
* **Custom geometry** - A custom microphone geometry for a 4-microphone array is provided by specifying the microphone coordinates. The units for coordinates are millimeters.
* **Beamforming angles** - Beamforming angles are specified to optimize for audio originating in that range. The units for angles are degrees. In the sample code below, the start angle is set to 70 degrees and the end angle is set to 110 degrees.
-* **Audio input** - The audio input is from a push stream, where the audio within the stream is expected to be captured from an audio input device corresponding to the custom geometry specified.
+* **Audio input** - The audio input is from a push stream, where the audio is expected to come from an audio input device that matches the specified custom geometry.
### [C#](#tab/csharp)
SpeechRecognizer recognizer = new SpeechRecognizer(speechConfig, audioInput);
Microsoft Audio Stack requires the reference channel (also known as the loopback channel) to perform echo cancellation. The source of the reference channel varies by platform:
* **Windows** - The reference channel is automatically gathered by the Speech SDK if the `SpeakerReferenceChannel::LastChannel` option is provided when creating `AudioProcessingOptions`.
-* **Linux** - ALSA (Advanced Linux Sound Architecture) will need to be configured to provide the reference audio stream as the last channel for the audio input device that will be used. This is in addition to providing the `SpeakerReferenceChannel::LastChannel` option when creating `AudioProcessingOptions`.
+* **Linux** - ALSA (Advanced Linux Sound Architecture) must be configured to provide the reference audio stream as the last channel for the audio input device used. ALSA is configured in addition to providing the `SpeakerReferenceChannel::LastChannel` option when creating `AudioProcessingOptions`.
## Language and platform support
-| Language | Platform(s) | Reference docs |
+| Language | Platform | Reference docs |
||-|-|
| C++ | Windows, Linux | [C++ docs](/cpp/cognitive-services/speech/) |
| C# | Windows, Linux | [C# docs](/dotnet/api/microsoft.cognitiveservices.speech) |
cognitive-services Get Started Speech Translation https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/get-started-speech-translation.md
Title: Speech translation quickstart - Speech service
-description: Learn how to use the Speech SDK to translate speech. In this quickstart, you learn about object construction, supported audio input formats, and configuration options for speech translation.
+description: Learn how to use the Speech SDK to translate speech, including object construction, supported audio input formats, and configuration options.
keywords: speech translation
## Next steps
-* [Use codec compressed audio formats](how-to-use-codec-compressed-audio-input-streams.md)
+* Use [codec-compressed audio formats](how-to-use-codec-compressed-audio-input-streams.md).
* See the [quickstart samples](https://github.com/Azure-Samples/cognitive-services-speech-sdk/tree/master/quickstart) on GitHub.
cognitive-services How To Use Custom Entity Pattern Matching https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/how-to-use-custom-entity-pattern-matching.md
Title: How to use custom entity pattern matching with the C++ Speech SDK
+ Title: How to use custom entity pattern matching with the Speech SDK
description: In this guide, you learn how to recognize intents and custom entities from simple patterns.
Last updated 11/15/2021 -
+ms.devlang: cpp, csharp
+zone_pivot_groups: programming-languages-set-nine
+
-# How to use custom entity pattern matching with the C++ Speech SDK
+# How to use custom entity pattern matching with the Speech SDK
The Cognitive Services [Speech SDK](speech-sdk.md) has a built-in feature to provide **intent recognition** with **simple language pattern matching**. An intent is something the user wants to do: close a window, mark a checkbox, insert some text, etc.
-In this guide, you use the Speech SDK to develop a C++ console application that derives intents from speech utterances spoken through your device's microphone. You'll learn how to:
+In this guide, you use the Speech SDK to develop a console application that derives intents from speech utterances spoken through your device's microphone. You'll learn how to:
> [!div class="checklist"] >
In this guide, you use the Speech SDK to develop a C++ console application that
## When should you use this?
-Use this sample code if:
-* You are only interested in matching very strictly what the user said. These patterns match more aggressively than LUIS.
-* You do not have access to a [LUIS](../LUIS/index.yml) app, but still want intents. This can be helpful since it is embedded within the SDK.
-* You cannot or do not want to create a LUIS app but you still want some voice-commanding capability.
+Use this sample code if:
-If you do not have access to a [LUIS](../LUIS/index.yml) app, but still want intents, this can be helpful since it is embedded within the SDK.
+- You are only interested in strictly matching what the user said. These patterns match more aggressively than LUIS.
+- You do not have access to a [LUIS](../LUIS/index.yml) app, but still want intents. This can be helpful since it is embedded within the SDK.
+- You cannot or do not want to create a LUIS app but you still want some voice-commanding capability.
## Prerequisites
Be sure you have the following items before you begin this guide:
[!INCLUDE [Pattern Matching Overview](includes/pattern-matching-overview.md)]
-## Create a speech project in Visual Studio
--
-## Open your project in Visual Studio
-
-Next, open your project in Visual Studio.
-
-1. Launch Visual Studio 2019.
-2. Load your project and open `helloworld.cpp`.
-
-## Start with some boilerplate code
-
-Let's add some code that works as a skeleton for our project.
-
-```cpp
- #include <iostream>
- #include <speechapi_cxx.h>
-
- using namespace Microsoft::CognitiveServices::Speech;
- using namespace Microsoft::CognitiveServices::Speech::Intent;
-
- int main()
- {
- std::cout << "Hello World!\n";
-
- auto config = SpeechConfig::FromSubscription("YOUR_SUBSCRIPTION_KEY", "YOUR_SUBSCRIPTION_REGION");
- }
-```
-
-## Create a Speech configuration
-
-Before you can initialize an `IntentRecognizer` object, you need to create a configuration that uses the key and Azure region for your Cognitive Services prediction resource.
-
-* Replace `"YOUR_SUBSCRIPTION_KEY"` with your Cognitive Services prediction key.
-* Replace `"YOUR_SUBSCRIPTION_REGION"` with your Cognitive Services resource region.
-
-This sample uses the `FromSubscription()` method to build the `SpeechConfig`. For a full list of available methods, see [SpeechConfig Class](/cpp/cognitive-services/speech/speechconfig).
-
-## Initialize an IntentRecognizer
-
-Now create an `IntentRecognizer`. Insert this code right below your Speech configuration.
-
-```cpp
- auto intentRecognizer = IntentRecognizer::FromConfig(config);
-```
-
-## Add some intents
-
-You need to associate some patterns with a `PatternMatchingModel` and apply it to the `IntentRecognizer`.
-We will start by creating a `PatternMatchingModel` and adding a few intents to it. A `PatternMatchingIntent` is a struct, so we will just use the in-line syntax.
-
-> [!Note]
-> We can add multiple patterns to an `Intent`.
-
-```cpp
- auto model = PatternMatchingModel::FromId("myNewModel");
-
- model->Intents.push_back({"Take me to floor {floorName}.", "Go to floor {floorName}."} , "ChangeFloors");
- model->Intents.push_back({"{action} the door."}, "OpenCloseDoor");
-```
-
-## Add some custom entities
-
-To take full advantage of the pattern matcher, you can customize your entities. We will make "floorName" a list of the available floors.
-
-```cpp
- model->Entities.push_back({ "floorName" , Intent::EntityType::List, Intent::EntityMatchMode::Strict, {"one","two", "lobby", "ground floor"} });
-```
-
-## Apply our model to the Recognizer
-
-Now it is necessary to apply the model to the `IntentRecognizer`. It is possible to use multiple models at once, so the API takes a collection of models.
-
-```cpp
-    std::vector<std::shared_ptr<LanguageUnderstandingModel>> collection;
-
- collection.push_back(model);
- intentRecognizer->ApplyLanguageModels(collection);
-```
-
-## Recognize an intent
-
-From the `IntentRecognizer` object, you're going to call the `RecognizeOnceAsync()` method. This method asks the Speech service to recognize speech in a single phrase, and to stop recognizing once the phrase is identified. For simplicity, we'll wait for the returned future to complete.
-
-Insert this code below your intents:
-
-```cpp
- std::cout << "Say something ..." << std::endl;
- auto result = intentRecognizer->RecognizeOnceAsync().get();
-```
-
-## Display the recognition results (or errors)
-
-When the recognition result is returned by the Speech service, let's just print the result.
-
-Insert this code below `auto result = intentRecognizer->RecognizeOnceAsync().get();`:
-
-```cpp
-switch (result->Reason)
-{
-case ResultReason::RecognizedSpeech:
- std::cout << "RECOGNIZED: Text = " << result->Text.c_str() << std::endl;
- std::cout << "NO INTENT RECOGNIZED!" << std::endl;
- break;
-case ResultReason::RecognizedIntent:
- std::cout << "RECOGNIZED: Text = " << result->Text.c_str() << std::endl;
- std::cout << " Intent Id = " << result->IntentId.c_str() << std::endl;
- auto entities = result->GetEntities();
- if (entities.find("floorName") != entities.end())
- {
- std::cout << " Floor name: = " << entities["floorName"].c_str() << std::endl;
- }
-
- if (entities.find("action") != entities.end())
- {
- std::cout << " Action: = " << entities["action"].c_str() << std::endl;
- }
-
- break;
-case ResultReason::NoMatch:
-{
- auto noMatch = NoMatchDetails::FromResult(result);
- switch (noMatch->Reason)
- {
- case NoMatchReason::NotRecognized:
- std::cout << "NOMATCH: Speech was detected, but not recognized." << std::endl;
- break;
- case NoMatchReason::InitialSilenceTimeout:
- std::cout << "NOMATCH: The start of the audio stream contains only silence, and the service timed out waiting for speech." << std::endl;
- break;
- case NoMatchReason::InitialBabbleTimeout:
- std::cout << "NOMATCH: The start of the audio stream contains only noise, and the service timed out waiting for speech." << std::endl;
- break;
- case NoMatchReason::KeywordNotRecognized:
- std::cout << "NOMATCH: Keyword not recognized" << std::endl;
- break;
- }
- break;
-}
-case ResultReason::Canceled:
-{
- auto cancellation = CancellationDetails::FromResult(result);
-
- if (!cancellation->ErrorDetails.empty())
- {
- std::cout << "CANCELED: ErrorDetails=" << cancellation->ErrorDetails.c_str() << std::endl;
- std::cout << "CANCELED: Did you update the subscription info?" << std::endl;
- }
-}
-default:
- break;
-}
-```
-
-## Check your code
-
-At this point, your code should look like this:
-
-```cpp
-#include <iostream>
-#include <speechapi_cxx.h>
-
-using namespace Microsoft::CognitiveServices::Speech;
-using namespace Microsoft::CognitiveServices::Speech::Intent;
-
-int main()
-{
- auto config = SpeechConfig::FromSubscription("YOUR_SUBSCRIPTION_KEY", "YOUR_SUBSCRIPTION_REGION");
- auto intentRecognizer = IntentRecognizer::FromConfig(config);
-
- auto model = PatternMatchingModel::FromId("myNewModel");
-
- model->Intents.push_back({"Take me to floor {floorName}.", "Go to floor {floorName}."} , "ChangeFloors");
- model->Intents.push_back({"{action} the door."}, "OpenCloseDoor");
-
- model->Entities.push_back({ "floorName" , Intent::EntityType::List, Intent::EntityMatchMode::Strict, {"one","two", "lobby", "ground floor"} });
-
-    std::vector<std::shared_ptr<LanguageUnderstandingModel>> collection;
-
- collection.push_back(model);
- intentRecognizer->ApplyLanguageModels(collection);
-
- std::cout << "Say something ..." << std::endl;
-
- auto result = intentRecognizer->RecognizeOnceAsync().get();
-
- switch (result->Reason)
- {
- case ResultReason::RecognizedSpeech:
- std::cout << "RECOGNIZED: Text = " << result->Text.c_str() << std::endl;
- std::cout << "NO INTENT RECOGNIZED!" << std::endl;
- break;
- case ResultReason::RecognizedIntent:
- std::cout << "RECOGNIZED: Text = " << result->Text.c_str() << std::endl;
- std::cout << " Intent Id = " << result->IntentId.c_str() << std::endl;
- auto entities = result->GetEntities();
- if (entities.find("floorName") != entities.end())
- {
- std::cout << " Floor name: = " << entities["floorName"].c_str() << std::endl;
- }
-
- if (entities.find("action") != entities.end())
- {
- std::cout << " Action: = " << entities["action"].c_str() << std::endl;
- }
-
- break;
- case ResultReason::NoMatch:
- {
- auto noMatch = NoMatchDetails::FromResult(result);
- switch (noMatch->Reason)
- {
- case NoMatchReason::NotRecognized:
- std::cout << "NOMATCH: Speech was detected, but not recognized." << std::endl;
- break;
- case NoMatchReason::InitialSilenceTimeout:
- std::cout << "NOMATCH: The start of the audio stream contains only silence, and the service timed out waiting for speech." << std::endl;
- break;
- case NoMatchReason::InitialBabbleTimeout:
- std::cout << "NOMATCH: The start of the audio stream contains only noise, and the service timed out waiting for speech." << std::endl;
- break;
- case NoMatchReason::KeywordNotRecognized:
- std::cout << "NOMATCH: Keyword not recognized." << std::endl;
- break;
- }
- break;
- }
- case ResultReason::Canceled:
- {
- auto cancellation = CancellationDetails::FromResult(result);
-
- if (!cancellation->ErrorDetails.empty())
- {
- std::cout << "CANCELED: ErrorDetails=" << cancellation->ErrorDetails.c_str() << std::endl;
- std::cout << "CANCELED: Did you update the subscription info?" << std::endl;
- }
- }
- default:
- break;
- }
-}
-```
-## Build and run your app
-
-Now you're ready to build your app and test your speech recognition using the Speech service.
-
-1. **Compile the code** - From the menu bar of Visual Studio, choose **Build** > **Build Solution**.
-2. **Start your app** - From the menu bar, choose **Debug** > **Start Debugging** or press <kbd>F5</kbd>.
-3. **Start recognition** - It will prompt you to say something. The default language is English. Your speech is sent to the Speech service, transcribed as text, and rendered in the console.
-
-For example, if you say "Take me to floor 2", this should be the output:
-
-```
-Say something ...
-RECOGNIZED: Text = Take me to floor 2.
- Intent Id = ChangeFloors
- Floor name: = 2
-```
-
-As another example, if you say "Take me to floor 7", this should be the output:
-
-```
-Say something ...
-RECOGNIZED: Text = Take me to floor 7.
-NO INTENT RECOGNIZED!
-```
-The Intent ID is empty because 7 was not in our list.
cognitive-services How To Use Simple Language Pattern Matching https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/how-to-use-simple-language-pattern-matching.md
Last updated 11/15/2021 -
+zone_pivot_groups: programming-languages-set-nine
+ # How to use simple language pattern matching with the C++ Speech SDK
A pattern is a phrase that includes an Entity somewhere within it. An Entity is
```
Take me to the {floorName}
```
-This defines an Entity with the ID "floorName" which is case sensitive.
+This defines an Entity with the ID "floorName" which is case-sensitive.
All other special characters and punctuation are ignored. Intents are added using calls to the `IntentRecognizer->AddIntent()` API.
-## Create a speech project in Visual Studio
--
-
-## Open your project in Visual Studio
-
-Next, open your project in Visual Studio.
-
-1. Launch Visual Studio 2019.
-2. Load your project and open `helloworld.cpp`.
-
-## Start with some boilerplate code
-
-Let's add some code that works as a skeleton for our project.
-
-```cpp
- #include <iostream>
- #include <speechapi_cxx.h>
-
- using namespace Microsoft::CognitiveServices::Speech;
- using namespace Microsoft::CognitiveServices::Speech::Intent;
-
- int main()
- {
- std::cout << "Hello World!\n";
-
- auto config = SpeechConfig::FromSubscription("YOUR_SUBSCRIPTION_KEY", "YOUR_SUBSCRIPTION_REGION");
- }
-```
-
-## Create a Speech configuration
-
-Before you can initialize an `IntentRecognizer` object, you need to create a configuration that uses the key and location for your Cognitive Services prediction resource.
-
-* Replace `"YOUR_SUBSCRIPTION_KEY"` with your Cognitive Services prediction key.
-* Replace `"YOUR_SUBSCRIPTION_REGION"` with your Cognitive Services resource region.
-
-This sample uses the `FromSubscription()` method to build the `SpeechConfig`. For a full list of available methods, see [SpeechConfig Class](/cpp/cognitive-services/speech/speechconfig).
-
-## Initialize an IntentRecognizer
-
-Now create an `IntentRecognizer`. Insert this code right below your Speech configuration.
-
-```cpp
- auto intentRecognizer = IntentRecognizer::FromConfig(config);
-```
-
-## Add some intents
-
-You need to associate some patterns with the `IntentRecognizer` by calling `AddIntent()`.
-We will add 2 intents with the same ID for changing floors, and another intent with a separate ID for opening and closing doors.
-
-```cpp
- intentRecognizer->AddIntent("Take me to floor {floorName}.", "ChangeFloors");
- intentRecognizer->AddIntent("Go to floor {floorName}.", "ChangeFloors");
- intentRecognizer->AddIntent("{action} the door.", "OpenCloseDoor");
-```
-
-> [!NOTE]
-> There is no limit to the number of entities you can declare, but they will be loosely matched. If you add a phrase like "{action} door" it will match any time there is text before the word "door". Intents are evaluated based on their number of entities. If two patterns would match, the one with more defined entities is returned.
-
-## Recognize an intent
-
-From the `IntentRecognizer` object, you're going to call the `RecognizeOnceAsync()` method. This method asks the Speech service to recognize speech in a single phrase, and to stop recognizing once the phrase is identified. For simplicity, we'll wait for the returned future to complete.
-
-Insert this code below your intents:
-
-```cpp
- std::cout << "Say something ..." << std::endl;
- auto result = intentRecognizer->RecognizeOnceAsync().get();
-```
-
-## Display the recognition results (or errors)
-
-When the recognition result is returned by the Speech service, let's just print the result.
-
-Insert this code below `auto result = intentRecognizer->RecognizeOnceAsync().get();`:
-
-```cpp
-switch (result->Reason)
-{
-case ResultReason::RecognizedSpeech:
- std::cout << "RECOGNIZED: Text = " << result->Text.c_str() << std::endl;
- std::cout << "NO INTENT RECOGNIZED!" << std::endl;
- break;
-case ResultReason::RecognizedIntent:
- std::cout << "RECOGNIZED: Text = " << result->Text.c_str() << std::endl;
- std::cout << " Intent Id = " << result->IntentId.c_str() << std::endl;
- auto entities = result->GetEntities();
- if (entities.find("floorName") != entities.end())
- {
- std::cout << " Floor name: = " << entities["floorName"].c_str() << std::endl;
- }
-
- if (entities.find("action") != entities.end())
- {
- std::cout << " Action: = " << entities["action"].c_str() << std::endl;
- }
-
- break;
-case ResultReason::NoMatch:
-{
- auto noMatch = NoMatchDetails::FromResult(result);
- switch (noMatch->Reason)
- {
- case NoMatchReason::NotRecognized:
- std::cout << "NOMATCH: Speech was detected, but not recognized." << std::endl;
- break;
- case NoMatchReason::InitialSilenceTimeout:
- std::cout << "NOMATCH: The start of the audio stream contains only silence, and the service timed out waiting for speech." << std::endl;
- break;
- case NoMatchReason::InitialBabbleTimeout:
- std::cout << "NOMATCH: The start of the audio stream contains only noise, and the service timed out waiting for speech." << std::endl;
- break;
- case NoMatchReason::KeywordNotRecognized:
- std::cout << "NOMATCH: Keyword not recognized" << std::endl;
- break;
- }
- break;
-}
-case ResultReason::Canceled:
-{
- auto cancellation = CancellationDetails::FromResult(result);
-
- if (!cancellation->ErrorDetails.empty())
- {
- std::cout << "CANCELED: ErrorDetails=" << cancellation->ErrorDetails.c_str() << std::endl;
- std::cout << "CANCELED: Did you update the subscription info?" << std::endl;
- }
-}
-default:
- break;
-}
-```
-
-## Check your code
-
-At this point, your code should look like this:
-
-```cpp
-#include <iostream>
-#include <speechapi_cxx.h>
-
-using namespace Microsoft::CognitiveServices::Speech;
-using namespace Microsoft::CognitiveServices::Speech::Intent;
-
-int main()
-{
- auto config = SpeechConfig::FromSubscription("YOUR_SUBSCRIPTION_KEY", "YOUR_SUBSCRIPTION_REGION");
- auto intentRecognizer = IntentRecognizer::FromConfig(config);
-
- intentRecognizer->AddIntent("Take me to floor {floorName}.", "ChangeFloors");
- intentRecognizer->AddIntent("Go to floor {floorName}.", "ChangeFloors");
- intentRecognizer->AddIntent("{action} the door.", "OpenCloseDoor");
-
- std::cout << "Say something ..." << std::endl;
-
- auto result = intentRecognizer->RecognizeOnceAsync().get();
-
- switch (result->Reason)
- {
- case ResultReason::RecognizedSpeech:
- std::cout << "RECOGNIZED: Text = " << result->Text.c_str() << std::endl;
- std::cout << "NO INTENT RECOGNIZED!" << std::endl;
- break;
-    case ResultReason::RecognizedIntent:
-    {
- std::cout << "RECOGNIZED: Text = " << result->Text.c_str() << std::endl;
- std::cout << " Intent Id = " << result->IntentId.c_str() << std::endl;
- auto entities = result->GetEntities();
- if (entities.find("floorName") != entities.end())
- {
- std::cout << " Floor name: = " << entities["floorName"].c_str() << std::endl;
- }
-
- if (entities.find("action") != entities.end())
- {
- std::cout << " Action: = " << entities["action"].c_str() << std::endl;
- }
-
-        break;
-    }
- case ResultReason::NoMatch:
- {
- auto noMatch = NoMatchDetails::FromResult(result);
- switch (noMatch->Reason)
- {
- case NoMatchReason::NotRecognized:
- std::cout << "NOMATCH: Speech was detected, but not recognized." << std::endl;
- break;
- case NoMatchReason::InitialSilenceTimeout:
- std::cout << "NOMATCH: The start of the audio stream contains only silence, and the service timed out waiting for speech." << std::endl;
- break;
- case NoMatchReason::InitialBabbleTimeout:
- std::cout << "NOMATCH: The start of the audio stream contains only noise, and the service timed out waiting for speech." << std::endl;
- break;
- case NoMatchReason::KeywordNotRecognized:
- std::cout << "NOMATCH: Keyword not recognized." << std::endl;
- break;
- }
- break;
- }
- case ResultReason::Canceled:
- {
- auto cancellation = CancellationDetails::FromResult(result);
-
- if (!cancellation->ErrorDetails.empty())
- {
- std::cout << "CANCELED: ErrorDetails=" << cancellation->ErrorDetails.c_str() << std::endl;
- std::cout << "CANCELED: Did you update the subscription info?" << std::endl;
- }
-        break;
-    }
- default:
- break;
- }
-}
-```
-## Build and run your app
-
-Now you're ready to build your app and test speech recognition using the Speech service.
-
-1. **Compile the code** - From the menu bar of Visual Studio, choose **Build** > **Build Solution**.
-2. **Start your app** - From the menu bar, choose **Debug** > **Start Debugging** or press <kbd>F5</kbd>.
-3. **Start recognition** - The app prompts you to say something. The default language is English. Your speech is sent to the Speech service, transcribed as text, and rendered in the console.
-
-For example, if you say "Take me to floor 7", the output should look like this:
-
-```
-Say something ...
-RECOGNIZED: Text = Take me to floor 7.
- Intent Id = ChangeFloors
- Floor name: = 7
-```
-
-## Next steps
-
-> Improve your pattern matching by using [custom entities](how-to-use-custom-entity-pattern-matching.md).
cognitive-services Improve Accuracy Phrase List https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/improve-accuracy-phrase-list.md
+
+ Title: Improve recognition accuracy with phrase list
+description: Phrase lists can be used to customize speech recognition results based on context.
+++++ Last updated : 01/26/2022
+zone_pivot_groups: programming-languages-set-two-with-js-spx
++
+# Improve recognition accuracy with phrase list
+
+A phrase list is a list of words or phrases provided ahead of time to help improve speech recognition accuracy. Adding a phrase to a phrase list increases its importance, making it more likely to be recognized.
+
+Examples of phrases include:
+* Names
+* Geographical locations
+* Homonyms
+* Words or acronyms unique to your industry or organization
+
+Phrase lists are simple and lightweight:
+- **Just-in-time**: A phrase list is provided just before starting the speech recognition, eliminating the need to train a custom model.
+- **Lightweight**: You don't need a large data set. Simply provide a word or phrase to give it importance.
+
+You can use the Speech SDK or Speech Command Line Interface (CLI). The Batch transcription API does not support phrase lists.
+
+In some situations, [training a custom model](custom-speech-overview.md) that includes phrases is likely the best option to improve accuracy. In these cases, you wouldn't use a phrase list:
+- If you need to use a large list of phrases. A phrase list shouldn't have more than 500 phrases.
+- If you need a phrase list for a language that isn't currently supported. For supported phrase list locales, see [Language and voice support for the Speech service](language-support.md#phrase-list).
+- If you use a custom endpoint. Phrase lists can't be used with custom endpoints.
+
+## Try it in Speech Studio
+
+You can use Speech Studio to test how a phrase list helps improve recognition for your audio. To implement a phrase list in your production application, use the Speech SDK or Speech CLI.
+
+For example, let's say that you want the Speech service to recognize this sentence:
+"Hi Rehaan, this is Jessie from Contoso bank."
+
+After testing, you might find that it's incorrectly recognized as:
+"Hi **everyone**, this is **Jesse** from **can't do so bank**."
+
+In this case, you would add "Rehaan", "Jessie", and "Contoso" to your phrase list. Then the names should be recognized correctly.
+
+Now try Speech Studio to see how a phrase list can improve recognition accuracy.
+
+> [!NOTE]
+> You may be prompted to select your Azure subscription and Speech resource, and then acknowledge billing for your region. If you are new to Azure or Speech, see [Try the Speech service for free](overview.md#try-the-speech-service-for-free).
+
+1. Sign in to [Speech Studio](https://speech.microsoft.com/).
+1. Select **Real-time Speech-to-text**.
+1. You test speech recognition by uploading an audio file or recording audio with a microphone. For example, select **record audio with a microphone** and then say "Hi Rehaan, this is Jessie from Contoso bank." Then select the red button to stop recording.
+1. You should see the transcription result in the **Test results** text box. If "Rehaan", "Jessie", or "Contoso" were recognized incorrectly, you can add the terms to a phrase list in the next step.
+1. Select **Show advanced options** and turn on **Phrase list**.
+1. Enter "Contoso;Jessie;Rehaan" in the phrase list text box. Note that multiple phrases need to be separated by a semicolon.
+ :::image type="content" source="./media/custom-speech/phrase-list-after-zoom.png" alt-text="Screenshot of a phrase list applied in Speech Studio." lightbox="./media/custom-speech/phrase-list-after-full.png":::
+1. Use the microphone to test recognition again. Otherwise you can select the retry arrow next to your audio file to re-run your audio. The terms "Rehaan", "Jessie", and "Contoso" should now be recognized.
+
+## Implement phrase list
+
+With the [Speech SDK](speech-sdk.md), you add phrases before running speech recognition. You can optionally clear or update the phrase list; changes take effect before the next recognition.
+
+```csharp
+var phraseList = PhraseListGrammar.FromRecognizer(recognizer);
+phraseList.AddPhrase("Contoso");   // each call adds a single phrase
+phraseList.AddPhrase("Jessie");
+phraseList.AddPhrase("Rehaan");
+phraseList.Clear();                // removes all phrases from the list
+```
+
+With the [Speech SDK](speech-sdk.md), you add phrases before running speech recognition. You can optionally clear or update the phrase list; changes take effect before the next recognition.
+
+```cpp
+auto phraseListGrammar = PhraseListGrammar::FromRecognizer(recognizer);
+phraseListGrammar->AddPhrase("Contoso");   // each call adds a single phrase
+phraseListGrammar->AddPhrase("Jessie");
+phraseListGrammar->AddPhrase("Rehaan");
+phraseListGrammar->Clear();                // removes all phrases from the list
+```
+
+With the [Speech SDK](speech-sdk.md), you add phrases before running speech recognition. You can optionally clear or update the phrase list; changes take effect before the next recognition.
+
+```java
+PhraseListGrammar phraseList = PhraseListGrammar.fromRecognizer(recognizer);
+phraseList.addPhrase("Contoso");   // each call adds a single phrase
+phraseList.addPhrase("Jessie");
+phraseList.addPhrase("Rehaan");
+phraseList.clear();                // removes all phrases from the list
+```
+
+With the [Speech SDK](speech-sdk.md), you add phrases before running speech recognition. You can optionally clear or update the phrase list; changes take effect before the next recognition.
+
+```javascript
+const phraseList = sdk.PhraseListGrammar.fromRecognizer(recognizer);
+phraseList.addPhrase("Contoso");   // each call adds a single phrase
+phraseList.addPhrase("Jessie");
+phraseList.addPhrase("Rehaan");
+phraseList.clear();                // removes all phrases from the list
+```
+
+With the [Speech SDK](speech-sdk.md), you add phrases before running speech recognition. You can optionally clear or update the phrase list; changes take effect before the next recognition.
+
+```python
+phrase_list_grammar = speechsdk.PhraseListGrammar.from_recognizer(reco)
+phrase_list_grammar.addPhrase("Contoso")   # each call adds a single phrase
+phrase_list_grammar.addPhrase("Jessie")
+phrase_list_grammar.addPhrase("Rehaan")
+phrase_list_grammar.clear()                # removes all phrases from the list
+```
+
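+The snippets above assume an existing recognizer object. As a rough end-to-end sketch in Python (the key and region values are placeholder assumptions, not working credentials), you might wire it up like this:
+
+```python
+import azure.cognitiveservices.speech as speechsdk
+
+# Placeholder values: replace with your own key and region.
+speech_config = speechsdk.SpeechConfig(subscription="YourSubscriptionKey", region="YourRegion")
+reco = speechsdk.SpeechRecognizer(speech_config=speech_config)
+
+# Add phrases before starting recognition.
+phrase_list_grammar = speechsdk.PhraseListGrammar.from_recognizer(reco)
+phrase_list_grammar.addPhrase("Contoso")
+phrase_list_grammar.addPhrase("Jessie")
+phrase_list_grammar.addPhrase("Rehaan")
+
+result = reco.recognize_once()
+print(result.text)
+```
+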
+With the [Speech CLI](spx-overview.md) you can include a phrase list along with the recognize command.
+
+# [Terminal](#tab/terminal)
+
+Try recognition from a microphone or an audio file.
+
+```console
+spx recognize --microphone --phrases "Contoso;Jessie;Rehaan;"
+spx recognize --file "your\path\to\audio.wav" --phrases "Contoso;Jessie;Rehaan;"
+```
+
+You can also add a phrase list using a text file that contains one phrase per line (for example, a file with the lines "Contoso", "Jessie", and "Rehaan"):
+
+```console
+spx recognize --microphone --phrases @phrases.txt
+spx recognize --file "your\path\to\audio.wav" --phrases @phrases.txt
+```
+
+# [PowerShell](#tab/powershell)
+
+Try recognition from a microphone or an audio file.
+
+```powershell
+spx --% recognize --microphone --phrases "Contoso;Jessie;Rehaan;"
+spx --% recognize --file "your\path\to\audio.wav" --phrases "Contoso;Jessie;Rehaan;"
+```
+
+You can also add a phrase list using a text file that contains one phrase per line (for example, a file with the lines "Contoso", "Jessie", and "Rehaan"):
+
+```powershell
+spx --% recognize --microphone --phrases @phrases.txt
+spx --% recognize --file "your\path\to\audio.wav" --phrases @phrases.txt
+```
+
+***
++
+## Next steps
+
+Check out more options to improve recognition accuracy.
+
+> [!div class="nextstepaction"]
+> [Custom Speech](custom-speech-overview.md)
+
cognitive-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/language-support.md
To improve accuracy, customization is available for some languages and baseline
| Turkish (Turkey) | `tr-TR` | Plain text |
| Vietnamese (Vietnam) | `vi-VN` | Plain text |
+### Phrase list
+
+You can use the locales in this table with [phrase list](improve-accuracy-phrase-list.md).
+
+| Language | Locale |
+|--|--|
+| Chinese (Mandarin, Simplified) | `zh-CN` |
+| English (Australia) | `en-AU` |
+| English (Canada) | `en-CA` |
+| English (India) | `en-IN` |
+| English (United Kingdom) | `en-GB` |
+| English (United States) | `en-US` |
+| French (France) | `fr-FR` |
+| German (Germany) | `de-DE` |
+| Italian (Italy) | `it-IT` |
+| Japanese (Japan) | `ja-JP` |
+| Portuguese (Brazil) | `pt-BR` |
+| Spanish (Spain) | `es-ES` |
+
## Text-to-speech
Both the Microsoft Speech SDK and REST APIs support these neural voices, each of which supports a specific language and dialect, identified by locale. You can also get a full list of languages and voices supported for each specific region or endpoint through the [voices list API](rest-text-to-speech.md#get-a-list-of-voices).
cognitive-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/overview.md
To add a Speech service resource to your Azure account by using the free or paid
1. Give a unique name for your new resource. The name helps you distinguish among multiple subscriptions tied to the same service.
1. Choose the Azure subscription that the new resource is associated with to determine how the fees are billed. For an introduction, see [how to create an Azure subscription](../../cost-management-billing/manage/create-subscription.md#create-a-subscription-in-the-azure-portal) in the Azure portal.
- 1. Choose the [region](regions.md) where the resource will be used. Azure is a global cloud platform that's generally available in many regions worldwide. To get the best performance, select a region thatΓÇÖs closest to you or where your application runs. The Speech service availabilities vary among different regions. Make sure that you create your resource in a supported region. For more information, see [region support for Speech services](./regions.md#speech-to-text-text-to-speech-and-translation).
+ 1. Choose the [region](regions.md) where the resource will be used. Azure is a global cloud platform that's generally available in many regions worldwide. To get the best performance, select a region that's closest to you or where your application runs. The Speech service availabilities vary among different regions. Make sure that you create your resource in a supported region. For more information, see [region support for Speech services](./regions.md#speech-to-text-text-to-speech-and-translation).
1. Choose either a free (F0) or paid (S0) pricing tier. For complete information about pricing and usage quotas for each tier, select **View full pricing details** or see [Speech services pricing](https://azure.microsoft.com/pricing/details/cognitive-services/speech-services/). For limits on resources, see [Azure Cognitive Services limits](../../azure-resource-manager/management/azure-subscription-service-limits.md#azure-cognitive-services-limits).
1. Create a new resource group for this Speech subscription or assign the subscription to an existing resource group. Resource groups help you keep your various Azure subscriptions organized.
1. Select **Create**. This action takes you to the deployment overview and displays deployment progress messages.
After you've had a chance to get started with the Speech service, try our tutorials:
- [Tutorial: Recognize intents from speech with the Speech SDK and LUIS, C#](how-to-recognize-intents-from-speech-csharp.md)
- [Tutorial: Voice enable your bot with the Speech SDK, C#](tutorial-voice-enable-your-bot-speech-sdk.md)
-- [Tutorial: Build a Flask app to translate text, analyze sentiment, and synthesize translated text to speech, REST](../translator/tutorial-build-flask-app-translation-synthesis.md?bc=%2fazure%2fcognitive-services%2fspeech-service%2fbreadcrumb%2ftoc.json%252c%2fen-us%2fazure%2fbread%2ftoc.json&toc=%2fazure%2fcognitive-services%2fspeech-service%2ftoc.json%252c%2fen-us%2fazure%2fcognitive-services%2fspeech-service%2ftoc.json)
+- [Tutorial: Build a Flask app to translate text, analyze sentiment, and synthesize translated text to speech, REST](/learn/modules/python-flask-build-ai-web-app/)
## Get sample code
cognitive-services Releasenotes https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/releasenotes.md
See below for information about changes to Speech services and resources.
## What's new?
+* Speech SDK 1.20.0 released January 2022. Updates include extended programming language support for DialogServiceConnector, Unity on Linux, enhancements to IntentRecognizer, added support for Python 3.10, and a fix to remove a 10-second delay while stopping a speech recognizer (when using a PushAudioInputStream, and no new audio is pushed in after StopContinuousRecognition is called).
+* Speech CLI 1.20.0 released January 2022. Updates include microphone input for Speaker recognition and expanded support for Intent recognition.
* Speaker Recognition service is generally available (GA). With [Speaker Recognition](./speaker-recognition-overview.md) you can accurately verify and identify speakers by their unique voice characteristics.
-* Speech SDK 1.19.0 release including Speaker Recognition support, Mac M1 ARM support, OpenSSL linking in Linux is dynamic, and Ubuntu 16.04 is no longer supported.
* Custom Neural Voice extended to support [49 locales](./language-support.md#custom-neural-voice).
* Prebuilt Neural Voice added new [languages and variants](./language-support.md#prebuilt-neural-voices).
* Commitment Tiers added to [pricing options](https://azure.microsoft.com/pricing/details/cognitive-services/speech-services/).
See below for information about changes to Speech services and resources.
[!INCLUDE [speech-cli](./includes/release-notes/release-notes-cli.md)]
-# [Text-to-speech](#tab/text-to-speech)
+# [Text-to-speech service](#tab/text-to-speech)
[!INCLUDE [text-to-speech](./includes/release-notes/release-notes-tts.md)]
-# [Speech-to-text](#tab/speech-to-text)
+# [Speech-to-text service](#tab/speech-to-text)
[!INCLUDE [speech-to-text](./includes/release-notes/release-notes-stt.md)]
cognitive-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Translator/language-support.md
Previously updated : 10/28/2021 Last updated : 02/01/2022 # Translator language support
**Dictionary:** Use the [Dictionary Lookup](reference/v3-0-dictionary-lookup.md) or [Dictionary Examples](reference/v3-0-dictionary-examples.md) operations from the Text Translation feature to display alternative translations from or to English and examples of words in context.
+## Translation
| Language | Language code | Cloud – Text Translation and Document Translation | Containers – Text Translation | Custom Translator | Auto Language Detection | Dictionary |
|:-|:-:|:-:|:-:|:-:|:-:|:-:|
| Afrikaans | `af` |✔|✔|✔|✔|✔|
| Hungarian | `hu` |✔|✔|✔|✔|✔|
| Icelandic | `is` |✔|✔|✔|✔|✔|
| Indonesian | `id` |✔|✔|✔|✔|✔|
+| 🆕 </br> Inuinnaqtun | `ikt` |✔|||||
| Inuktitut | `iu` |✔|✔|✔|✔||
+| 🆕 </br> Inuktitut (Latin) | `iu-Latn` |✔|||||
| Irish | `ga` |✔|✔|✔|✔||
| Italian | `it` |✔|✔|✔|✔|✔|
| Japanese | `ja` |✔|✔|✔|✔|✔|
cognitive-services Tutorial Build Flask App Translation Synthesis https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Translator/tutorial-build-flask-app-translation-synthesis.md
- Title: "Tutorial: Build a Flask app to translate, synthesize, and analyze text - Translator"-
-description: In this tutorial, you'll build a Flask-based web app to translate text, analyze sentiment, and synthesize translated text into speech.
------ Previously updated : 10/28/2021----
-# Tutorial: Build a Flask app with Azure Cognitive Services
-
-In this tutorial, you'll build a Flask web app that uses Azure Cognitive Services to translate text, analyze sentiment, and synthesize translated text into speech. Our focus is on the Python code and Flask routes that enable the application; however, we'll also provide the HTML and JavaScript that pulls the app together. If you run into any issues, let us know using the feedback button below.
-
-Here's what this tutorial covers:
-
-> [!div class="checklist"]
-> * Get Azure subscription keys
-> * Set up your development environment and install dependencies
-> * Create a Flask app
-> * Use the Translator to translate text
-> * Use the Language service to analyze positive/negative sentiment of input text and translations
-> * Use the Speech Service to convert translated text into synthesized speech
-> * Run your Flask app locally
-
-> [!TIP]
-> If you'd like to skip ahead and see all the code at once, the entire sample, along with build instructions are available on [GitHub](https://github.com/MicrosoftTranslator/Text-Translation-API-V3-Flask-App-Tutorial).
-
-## What is Flask?
-
-Flask is a microframework for creating web applications: it provides the tools, libraries, and technologies you need to build one. That web application can be a few web pages, a blog, or a wiki, or something as substantial as a web-based calendar application or a commercial website. A minimal example follows.
-
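-Here's a minimal sketch of a Flask app, just for orientation (the tutorial builds its own app step by step below):
-
-```python
-from flask import Flask
-
-app = Flask(__name__)
-
-@app.route('/')
-def hello():
-    # Returning a string renders it as the response body.
-    return 'Hello, world!'
-```
-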
-If you'd like to dive deeper after this tutorial, here are a few helpful links:
-
-* [Flask documentation](http://flask.pocoo.org/)
-* [Flask for Dummies - A Beginner's Guide to Flask](https://codeburst.io/flask-for-dummies-a-beginners-guide-to-flask-part-uno-53aec6afc5b1)
-
-## Prerequisites
-
-Let's review the software and subscription keys that you'll need for this tutorial.
-
-* [Python 3.6 or later](https://www.python.org/downloads/)
-* [Git tools](https://git-scm.com/downloads)
-* An IDE or text editor, such as [Visual Studio Code](https://code.visualstudio.com/) or [Atom](https://atom.io/)
-* [Chrome](https://www.google.com/chrome/browser/) or [Firefox](https://www.mozilla.org/firefox)
-* A **Translator** subscription key (you can likely use the **global** location.)
-* A **Language service** subscription key in the **West US** region.
-* A **Speech Services** subscription key in the **West US** region.
-
-## Create an account and subscribe to resources
-
-As previously mentioned, you're going to need three subscription keys for this tutorial. This means that you need to create a resource within your Azure account for:
-* Translator
-* Language service
-* Speech Services
-
-Use [Create a Cognitive Services Account in the Azure portal](../cognitive-services-apis-create-account.md) for step-by-step instructions to create resources.
-
-> [!IMPORTANT]
-> For this tutorial, please create your resources in the West US region. If using a different region, you'll need to adjust the base URL in each of your Python files.
-
-## Set up your dev environment
-
-Before you build your Flask web app, you'll need to create a working directory for your project and install a few Python packages.
-
-### Create a working directory
-
-1. Open command line (Windows) or terminal (macOS/Linux). Then, create a working directory and sub directories for your project:
-
- ```
- mkdir -p flask-cog-services/static/scripts && mkdir flask-cog-services/templates
- ```
-2. Change to your project's working directory:
-
- ```
- cd flask-cog-services
- ```
-
-### Create and activate your virtual environment with `virtualenv`
-
-Let's create a virtual environment for our Flask app using `virtualenv`. Using a virtual environment ensures that you have a clean environment to work from.
-
-1. In your working directory, run this command to create a virtual environment:
- **macOS/Linux:**
- ```
- virtualenv venv --python=python3
- ```
- We've explicitly declared that the virtual environment should use Python 3. This ensures that users with multiple Python installations are using the correct version.
-
- **Windows CMD / Windows Bash:**
- ```
- virtualenv venv
- ```
-    To keep things simple, we're naming the virtual environment `venv`.
-
-2. The commands to activate your virtual environment will vary depending on your platform/shell:
-
- | Platform | Shell | Command |
- |-|-||
- | macOS/Linux | bash/zsh | `source venv/bin/activate` |
- | Windows | bash | `source venv/Scripts/activate` |
- | | Command Line | `venv\Scripts\activate.bat` |
- | | PowerShell | `venv\Scripts\Activate.ps1` |
-
-    After running this command, your command line or terminal session should be prefaced with `(venv)`.
-
-3. You can deactivate the session at any time by typing this into the command line or terminal: `deactivate`.
-
-> [!NOTE]
-> Python has extensive documentation for creating and managing virtual environments, see [virtualenv](https://virtualenv.pypa.io/en/latest/).
-
-### Install requests
-
-Requests is a popular module used to send HTTP/1.1 requests. With it, there's no need to manually add query strings to your URLs or to form-encode your POST data. A short example follows the note below.
-
-1. To install requests, run:
-
- ```
- pip install requests
- ```
-
-> [!NOTE]
-> If you'd like to learn more about requests, see [Requests: HTTP for Humans](https://2.python-requests.org/en/master/).
-
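-Here's a small sketch, not part of the tutorial code, of how requests handles encoding for you. The endpoint and parameters mirror the Translator call you'll write shortly; without the authentication headers added later, the call returns an error status:
-
-```python
-import requests
-
-params = {'api-version': '3.0', 'to': 'de'}
-body = [{'text': 'Hello, world!'}]
-# requests builds the query string from params and serializes body as JSON.
-response = requests.post('https://api.cognitive.microsofttranslator.com/translate',
-                         params=params, json=body)
-print(response.status_code)  # 401 until you add the auth headers shown later
-```
-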
-### Install and configure Flask
-
-Next, we need to install Flask. Flask handles the routing for our web app and lets us make server-to-server calls that hide our subscription keys from the end user.
-
-1. To install Flask, run:
- ```
- pip install Flask
- ```
- Let's make sure Flask was installed. Run:
- ```
- flask --version
- ```
-    The version should be printed to the terminal. If you see an error instead, something went wrong.
-
-2. To run the Flask app, you can either use the flask command or Python's -m switch with Flask. Before you can do that you need to tell your terminal which app to work with by exporting the `FLASK_APP` environment variable:
-
- **macOS/Linux**:
- ```
- export FLASK_APP=app.py
- ```
-
- **Windows**:
- ```
- set FLASK_APP=app.py
- ```
-
-## Create your Flask app
-
-In this section, you're going to create a barebones Flask app that returns an HTML file when users hit the root of your app. Don't spend too much time trying to pick apart the code; we'll come back to update this file later.
-
-### What is a Flask route?
-
-Let's take a minute to talk about "[routes](http://flask.pocoo.org/docs/1.0/api/#flask.Flask.route)". Routing is used to bind a URL to a specific function. Flask uses route decorators to register functions to specific URLs. For example, when a user navigates to the root (`/`) of our web app, `index.html` is rendered.
-
-```python
-@app.route('/')
-def index():
-    return render_template('index.html')
-```
-
-Let's take a look at one more example to hammer this home.
-
-```python
-@app.route('/about')
-def about():
-    return render_template('about.html')
-```
-
-This code ensures that when a user navigates to `http://your-web-app.com/about`, the `about.html` file is rendered.
-
-While these samples illustrate how to render HTML pages for a user, routes can also be used to call APIs when a button is pressed, or to take any number of actions without navigating away from the homepage. You'll see this in action when you create routes for translation, sentiment, and speech synthesis.
-
-### Get started
-
-1. Open the project in your IDE, then create a file named `app.py` in the root of your working directory. Next, copy this code into `app.py` and save:
-
- ```python
- from flask import Flask, render_template, url_for, jsonify, request
-
- app = Flask(__name__)
- app.config['JSON_AS_ASCII'] = False
-
- @app.route('/')
- def index():
-        return render_template('index.html')
- ```
-
-    This code block tells the app to display `index.html` whenever a user navigates to the root of your web app (`/`).
-
-2. Next, let's create the front-end for our web app. Create a file named `index.html` in the `templates` directory. Then copy this code into `templates/index.html`.
-
- ```html
- <!doctype html>
- <html lang="en">
- <head>
- <!-- Required metadata tags -->
- <meta charset="utf-8">
- <meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
- <meta name="description" content="Translate and analyze text with Azure Cognitive Services.">
- <!-- Bootstrap CSS -->
- <link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/4.0.0/css/bootstrap.min.css" integrity="sha384-Gn5384xqQ1aoWXA+058RXPxPg6fy4IWvTNh0E263XmFcJlSAwiGgFAW/dAiS6JXm" crossorigin="anonymous">
- <title>Translate and analyze text with Azure Cognitive Services</title>
- </head>
- <body>
- <div class="container">
- <h1>Translate, synthesize, and analyze text with Azure</h1>
- <p>This simple web app uses Azure for text translation, text-to-speech conversion, and sentiment analysis of input text and translations. Learn more about <a href="https://docs.microsoft.com/azure/cognitive-services/">Azure Cognitive Services</a>.
- </p>
- <!-- HTML provided in the following sections goes here. -->
-
- <!-- End -->
- </div>
-
- <!-- Required Javascript for this tutorial -->
- <script src="https://code.jquery.com/jquery-3.2.1.slim.min.js" integrity="sha384-KJ3o2DKtIkvYIK3UENzmM7KCkRr/rE9/Qpg6aAZGJwFDMVNA/GpGFF93hXpG5KkN" crossorigin="anonymous"></script>
- <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
- <script src="https://cdnjs.cloudflare.com/ajax/libs/popper.js/1.12.9/umd/popper.min.js" integrity="sha384-ApNbgh9B+Y1QKtv3Rn7W3mgPxhU9K/ScQsAP7hUibX39j7fakFPskvXusvfa0b4Q" crossorigin="anonymous"></script>
- <script src="https://maxcdn.bootstrapcdn.com/bootstrap/4.0.0/js/bootstrap.min.js" integrity="sha384-JZR6Spejh4U02d8jOt6vLEHfe/JQGiRRSQQxSfFWpi1MquVdAyjUar5+76PVCmYl" crossorigin="anonymous"></script>
- <script type = "text/javascript" src ="static/scripts/main.js"></script>
- </body>
- </html>
- ```
-
-3. Let's test the Flask app. From the terminal, run:
-
- ```
- flask run
- ```
-
-4. Open a browser and navigate to the URL provided. You should see your single page app. Press **Ctrl + C** to kill the app.
-
-## Translate text
-
-Now that you have an idea of how a simple Flask app works, let's:
-
-* Write some Python to call the Translator and return a response
-* Create a Flask route to call your Python code
-* Update the HTML with an area for text input and translation, a language selector, and translate button
-* Write JavaScript that allows users to interact with your Flask app from the HTML
-
-### Call the Translator
-
-The first thing you need to do is write a function to call the Translator. This function will take two arguments: `text_input` and `language_output`. This function is called whenever a user presses the translate button in your app. The text area in the HTML is sent as the `text_input`, and the language selection value in the HTML is sent as `language_output`.
-
-1. Let's start by creating a file called `translate.py` in the root of your working directory.
-2. Next, add this code to `translate.py`. This function takes two arguments: `text_input` and `language_output`.
- ```python
- import os, requests, uuid, json
-
- # Don't forget to replace with your Cog Services subscription key!
- # If you prefer to use environment variables, see Extra Credit for more info.
- subscription_key = 'YOUR_TRANSLATOR_TEXT_SUBSCRIPTION_KEY'
-    # Don't forget to replace with your Cog Services resource location!
-    location = 'YOUR_TRANSLATOR_RESOURCE_LOCATION'
- # Our Flask route will supply two arguments: text_input and language_output.
- # When the translate text button is pressed in our Flask app, the Ajax request
- # will grab these values from our web app, and use them in the request.
- # See main.js for Ajax calls.
- def get_translation(text_input, language_output):
- base_url = 'https://api.cognitive.microsofttranslator.com'
- path = '/translate?api-version=3.0'
- params = '&to=' + language_output
- constructed_url = base_url + path + params
-
- headers = {
- 'Ocp-Apim-Subscription-Key': subscription_key,
- 'Ocp-Apim-Subscription-Region': location,
- 'Content-type': 'application/json',
- 'X-ClientTraceId': str(uuid.uuid4())
- }
-
- # You can pass more than one object in body.
- body = [{
- 'text' : text_input
- }]
- response = requests.post(constructed_url, headers=headers, json=body)
- return response.json()
- ```
-3. Add your Translator subscription key and resource location, and save. A sketch of the response shape follows.
-
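-For reference, a successful call returns a JSON array shaped roughly like this (a trimmed illustration, not an exact payload). The JavaScript you write later reads `translations` and `detectedLanguage`:
-
-```python
-# Trimmed illustration of a Translator v3 /translate response.
-sample_response = [
-    {
-        "detectedLanguage": {"language": "en", "score": 1.0},
-        "translations": [{"text": "Hallo Welt!", "to": "de"}],
-    }
-]
-print(sample_response[0]["translations"][0]["text"])  # Hallo Welt!
-```
-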
-### Add a route to `app.py`
-
-Next, you'll need to create a route in your Flask app that calls `translate.py`. This route will be called each time a user presses the translate button in your app.
-
-For this app, your route is going to accept `POST` requests. This is because the function expects the text to translate and an output language for the translation.
-
-Flask provides helper functions for parsing and managing each request. In the code provided, `get_json()` returns the data from the `POST` request as JSON. Then, using `data['text']` and `data['to']`, the text and output language values are passed to the `get_translation()` function available from `translate.py`. The last step is to return the response as JSON, since you'll need to display this data in your web app.
-
-In the following sections, you'll repeat this process as you create routes for sentiment analysis and speech synthesis.
-
-1. Open `app.py` and locate the import statement at the top of `app.py` and add the following line:
-
- ```python
- import translate
- ```
- Now our Flask app can use the method available via `translate.py`.
-
-2. Copy this code to the end of `app.py` and save:
-
- ```python
- @app.route('/translate-text', methods=['POST'])
- def translate_text():
- data = request.get_json()
- text_input = data['text']
- translation_output = data['to']
- response = translate.get_translation(text_input, translation_output)
- return jsonify(response)
- ```
-
-### Update `index.html`
-
-Now that you have a function to translate text, and a route in your Flask app to call it, the next step is to start building the HTML for your app. The HTML below does a few things:
-
-* Provides a text area where users can input text to translate.
-* Includes a language selector.
-* Includes HTML elements to render the detected language and confidence scores returned during translation.
-* Provides a read-only text area where the translation output is displayed.
-* Includes placeholders for sentiment analysis and speech synthesis code that you'll add to this file later in the tutorial.
-
-Let's update `index.html`.
-
-1. Open `index.html` and locate these code comments:
- ```html
- <!-- HTML provided in the following sections goes here. -->
-
- <!-- End -->
- ```
-
-2. Replace the code comments with this HTML block:
- ```html
- <div class="row">
- <div class="col">
- <form>
- <!-- Enter text to translate. -->
- <div class="form-group">
- <label for="text-to-translate"><strong>Enter the text you'd like to translate:</strong></label>
- <textarea class="form-control" id="text-to-translate" rows="5"></textarea>
- </div>
- <!-- Select output language. -->
- <div class="form-group">
- <label for="select-language"><strong>Translate to:</strong></label>
- <select class="form-control" id="select-language">
- <option value="ar">Arabic</option>
- <option value="ca">Catalan</option>
- <option value="zh-Hans">Chinese (Simplified)</option>
- <option value="zh-Hant">Chinese (Traditional)</option>
- <option value="hr">Croatian</option>
- <option value="en">English</option>
- <option value="fr">French</option>
- <option value="de">German</option>
- <option value="el">Greek</option>
- <option value="he">Hebrew</option>
- <option value="hi">Hindi</option>
- <option value="it">Italian</option>
- <option value="ja">Japanese</option>
- <option value="ko">Korean</option>
- <option value="pt">Portuguese</option>
- <option value="ru">Russian</option>
- <option value="es">Spanish</option>
- <option value="th">Thai</option>
- <option value="tr">Turkish</option>
- <option value="vi">Vietnamese</option>
- </select>
- </div>
- <button type="submit" class="btn btn-primary mb-2" id="translate">Translate text</button></br>
- <div id="detected-language" style="display: none">
- <strong>Detected language:</strong> <span id="detected-language-result"></span><br />
- <strong>Detection confidence:</strong> <span id="confidence"></span><br /><br />
- </div>
-
- <!-- Start sentiment code-->
-
- <!-- End sentiment code -->
-
- </form>
- </div>
- <div class="col">
- <!-- Translated text returned by the Translate API is rendered here. -->
- <form>
- <div class="form-group" id="translator-text-response">
- <label for="translation-result"><strong>Translated text:</strong></label>
- <textarea readonly class="form-control" id="translation-result" rows="5"></textarea>
- </div>
-
- <!-- Start voice font selection code -->
-
- <!-- End voice font selection code -->
-
- </form>
-
- <!-- Add Speech Synthesis button and audio element -->
-
- <!-- End Speech Synthesis button -->
-
- </div>
- </div>
- ```
-
-The next step is to write some JavaScript. This is the bridge between your HTML and Flask route.
-
-### Create `main.js`
-
-The `main.js` file is the bridge between your HTML and Flask route. Your app will use a combination of jQuery, Ajax, and XMLHttpRequest to render content, and make `POST` requests to your Flask routes.
-
-In the code below, content from the HTML is used to construct a request to your Flask route. Specifically, the contents of the text area and the language selector are assigned to variables, and then passed along in the request to `translate-text`.
-
-The code then iterates through the response, and updates the HTML with the translation, detected language, and confidence score.
-
-1. From your IDE, create a file named `main.js` in the `static/scripts` directory.
-2. Copy this code into `static/scripts/main.js`:
- ```javascript
- //Initiate jQuery on load.
- $(function() {
- //Translate text with flask route
- $("#translate").on("click", function(e) {
- e.preventDefault();
- var translateVal = document.getElementById("text-to-translate").value;
- var languageVal = document.getElementById("select-language").value;
- var translateRequest = { 'text': translateVal, 'to': languageVal }
-
- if (translateVal !== "") {
- $.ajax({
- url: '/translate-text',
- method: 'POST',
- headers: {
- 'Content-Type':'application/json'
- },
- dataType: 'json',
- data: JSON.stringify(translateRequest),
- success: function(data) {
- for (var i = 0; i < data.length; i++) {
- document.getElementById("translation-result").textContent = data[i].translations[0].text;
- document.getElementById("detected-language-result").textContent = data[i].detectedLanguage.language;
- if (document.getElementById("detected-language-result").textContent !== ""){
- document.getElementById("detected-language").style.display = "block";
- }
- document.getElementById("confidence").textContent = data[i].detectedLanguage.score;
- }
- }
- });
- };
- });
- // In the following sections, you'll add code for sentiment analysis and
- // speech synthesis here.
- })
- ```
-
-### Test translation
-
-Let's test translation in the app.
-
-```
-flask run
-```
-
-Navigate to the provided server address. Type text into the input area, select a language, and press translate. You should get a translation. If it doesn't work, make sure that you've added your subscription key.
-
-> [!TIP]
-> If the changes you've made aren't showing up, or the app doesn't work the way you expect it to, try clearing your cache or opening a private/incognito window.
-
-Press **Ctrl + C** to kill the app, then head to the next section.
-
-## Analyze sentiment
-
-The [Language service API](../language-service/overview.md) can be used to perform sentiment analysis, extract key phrases from text, or detect the source language. In this app, we're going to use sentiment analysis to determine if the provided text is positive, neutral, or negative. The v3.0 API returns a sentiment label for each document, along with confidence scores between 0 and 1 for each label.
-
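-For orientation, a successful v3.0 sentiment response is shaped roughly like this (a trimmed illustration, not an exact payload). The JavaScript you write later reads the `documents` and `errors` arrays:
-
-```python
-# Trimmed illustration of a v3.0 sentiment response.
-sample_response = {
-    "documents": [
-        {
-            "id": "1",
-            "sentiment": "positive",
-            "confidenceScores": {"positive": 0.98, "neutral": 0.01, "negative": 0.01},
-        }
-    ],
-    "errors": [],
-}
-print(sample_response["documents"][0]["sentiment"])  # positive
-```
-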
-In this section, you're going to do a few things:
-
-* Write some Python to call the Language service API to perform sentiment analysis and return a response
-* Create a Flask route to call your Python code
-* Update the HTML with an area for sentiment scores, and a button to perform analysis
-* Write JavaScript that allows users to interact with your Flask app from the HTML
-
-### Call the Language service API
-
-Let's write a function to call the Language service API. This function takes two arguments: `input_text` and `input_language`, and is called whenever a user presses the run sentiment analysis button in your app. The text provided by the user and the detected language are sent with each request, and the response object includes the sentiment for the input text. In the following sections, you're going to write some JavaScript to parse the response and use it in your app. For now, let's focus on calling the Language service API.
-
-1. Let's create a file called `sentiment.py` in the root of your working directory.
-2. Next, add this code to `sentiment.py`.
- ```python
- import os, requests, uuid, json
-
- # Don't forget to replace with your Cog Services subscription key!
- subscription_key = 'YOUR_TEXT_ANALYTICS_SUBSCRIPTION_KEY'
- endpoint = "YOUR_TEXT_ANALYTICS_ENDPOINT"
-    # Our Flask route will supply two arguments: input_text and input_language.
- # When the run sentiment analysis button is pressed in our Flask app,
- # the Ajax request will grab these values from our web app, and use them
- # in the request. See main.js for Ajax calls.
-
- def get_sentiment(input_text, input_language):
- path = '/text/analytics/v3.0/sentiment'
- constructed_url = endpoint + path
-
- headers = {
- 'Ocp-Apim-Subscription-Key': subscription_key,
- 'Content-type': 'application/json',
- 'X-ClientTraceId': str(uuid.uuid4())
- }
-
- # You can pass more than one object in body.
- body = {
- 'documents': [
- {
- 'language': input_language,
- 'id': '1',
- 'text': input_text
- },
- ]
- }
- response = requests.post(constructed_url, headers=headers, json=body)
- return response.json()
- ```
-3. Add your Language service subscription key and endpoint, and save.
-
-### Add a route to `app.py`
-
-Let's create a route in your Flask app that calls `sentiment.py`. This route will be called each time a user presses the run sentiment analysis button in your app. Like the route for translation, this route is going to accept `POST` requests since the function expects arguments.
-
-1. Open `app.py` and locate the import statement at the top of `app.py` and update it:
-
- ```python
- import translate, sentiment
- ```
- Now our Flask app can use the method available via `sentiment.py`.
-
-2. Copy this code to the end of `app.py` and save:
- ```python
- @app.route('/sentiment-analysis', methods=['POST'])
- def sentiment_analysis():
- data = request.get_json()
- input_text = data['inputText']
- input_lang = data['inputLanguage']
- response = sentiment.get_sentiment(input_text, input_lang)
- return jsonify(response)
- ```
-
-### Update `index.html`
-
-Now that you have a function to run sentiment analysis, and a route in your Flask app to call it, the next step is to start writing the HTML for your app. The HTML below does a few things:
-
-* Adds a button to your app to run sentiment analysis
-* Adds an element that explains sentiment scoring
-* Adds an element to display the sentiment scores
-
-1. Open `index.html` and locate these code comments:
- ```html
- <!-- Start sentiment code-->
-
- <!-- End sentiment code -->
- ```
-
-2. Replace the code comments with this HTML block:
- ```html
- <button type="submit" class="btn btn-primary mb-2" id="sentiment-analysis">Run sentiment analysis</button></br>
- <div id="sentiment" style="display: none">
- <p>Sentiment can be labeled as "positive", "negative", "neutral", or "mixed". </p>
- <strong>Sentiment label for input:</strong> <span id="input-sentiment"></span><br />
- </div>
- ```
-
-### Update `main.js`
-
-In the code below, content from the HTML is used to construct a request to your Flask route. Specifically, the contents of the text area and the detected language are assigned to variables, and then passed along in the request to the `sentiment-analysis` route.
-
-The code then iterates through the response and updates the HTML with the sentiment label.
-
-1. From your IDE, open `main.js` in the `static/scripts` directory.
-
-2. Add this code below the translation code in `static/scripts/main.js`:
- ```javascript
- //Run sentiment analysis on input and translation.
- $("#sentiment-analysis").on("click", function(e) {
- e.preventDefault();
- var inputText = document.getElementById("text-to-translate").value;
- var inputLanguage = document.getElementById("detected-language-result").innerHTML;
- var outputText = document.getElementById("translation-result").value;
- var outputLanguage = document.getElementById("select-language").value;
-
- var sentimentRequest = { "inputText": inputText, "inputLanguage": inputLanguage};
-
- if (inputText !== "") {
- $.ajax({
- url: "/sentiment-analysis",
- method: "POST",
- headers: {
- "Content-Type":"application/json"
- },
- dataType: "json",
- data: JSON.stringify(sentimentRequest),
- success: function(data) {
- for (var i = 0; i < data.documents.length; i++) {
- if (typeof data.documents[i] !== "undefined"){
- if (data.documents[i].id === "1") {
- document.getElementById("input-sentiment").textContent = data.documents[i].sentiment;
- }
- }
- }
- for (var i = 0; i < data.errors.length; i++) {
- if (typeof data.errors[i] !== "undefined"){
- if (data.errors[i].id === "1") {
- document.getElementById("input-sentiment").textContent = data.errors[i].message;
- }
- }
- }
- if (document.getElementById("input-sentiment").textContent !== ''){
- document.getElementById("sentiment").style.display = "block";
- }
- }
- });
- }
- });
- // In the next section, you'll add code for speech synthesis here.
- ```
-
-### Test sentiment analysis
-
-Let's test sentiment analysis in the app.
-
-```
-flask run
-```
-
-Navigate to the provided server address. Type text into the input area, select a language, and press translate. You should get a translation. Next, press the run sentiment analysis button. You should see the sentiment label for your input text. If it doesn't work, make sure that you've added your subscription key.
-
-> [!TIP]
-> If the changes you've made aren't showing up, or the app doesn't work the way you expect it to, try clearing your cache or opening a private/incognito window.
-
-Press **Ctrl + C** to kill the app, then head to the next section.
-
-## Convert text-to-speech
-
-The [Text-to-speech API](../speech-service/text-to-speech.md) enables your app to convert text into natural human-like synthesized speech. The service supports standard, neural, and custom voices. Our sample app uses a handful of the available voices. For a full list, see [supported languages](../speech-service/language-support.md#text-to-speech).
-
-In this section, you're going to do a few things:
-
-* Write some Python to convert text-to-speech with the Text-to-speech API
-* Create a Flask route to call your Python code
-* Update the HTML with a button to convert text-to-speech, and an element for audio playback
-* Write JavaScript that allows users to interact with your Flask app
-
-### Call the Text-to-Speech API
-
-Let's write a function to convert text-to-speech. This function takes two arguments: `input_text` and `voice_font`, and is called whenever a user presses the convert text-to-speech button in your app. `input_text` is the translation output returned by the call to translate text; `voice_font` is the value from the voice font selector in the HTML.
-
-1. Let's create a file called `synthesize.py` in the root of your working directory.
-
-2. Next, add this code to `synthesize.py`.
- ```Python
- import os, requests, time
- from xml.etree import ElementTree
-
- class TextToSpeech(object):
- def __init__(self, input_text, voice_font):
- subscription_key = 'YOUR_SPEECH_SERVICES_SUBSCRIPTION_KEY'
- self.subscription_key = subscription_key
- self.input_text = input_text
- self.voice_font = voice_font
- self.timestr = time.strftime('%Y%m%d-%H%M')
- self.access_token = None
-
- # This function performs the token exchange.
- def get_token(self):
- fetch_token_url = 'https://westus.api.cognitive.microsoft.com/sts/v1.0/issueToken'
- headers = {
- 'Ocp-Apim-Subscription-Key': self.subscription_key
- }
- response = requests.post(fetch_token_url, headers=headers)
- self.access_token = str(response.text)
-
- # This function calls the TTS endpoint with the access token.
- def save_audio(self):
- base_url = 'https://westus.tts.speech.microsoft.com/'
- path = 'cognitiveservices/v1'
- constructed_url = base_url + path
- headers = {
- 'Authorization': 'Bearer ' + self.access_token,
- 'Content-Type': 'application/ssml+xml',
- 'X-Microsoft-OutputFormat': 'riff-24khz-16bit-mono-pcm',
- 'User-Agent': 'YOUR_RESOURCE_NAME',
- }
- # Build the SSML request with ElementTree
- xml_body = ElementTree.Element('speak', version='1.0')
- xml_body.set('{http://www.w3.org/XML/1998/namespace}lang', 'en-us')
- voice = ElementTree.SubElement(xml_body, 'voice')
- voice.set('{http://www.w3.org/XML/1998/namespace}lang', 'en-US')
- voice.set('name', 'Microsoft Server Speech Text to Speech Voice {}'.format(self.voice_font))
- voice.text = self.input_text
- # The body must be encoded as UTF-8 to handle non-ascii characters.
- body = ElementTree.tostring(xml_body, encoding="utf-8")
-
- #Send the request
- response = requests.post(constructed_url, headers=headers, data=body)
-
-        # Return the binary audio content; the Flask route sends it
-        # to the browser for playback.
-        return response.content
- ```
-3. Add your Speech Services subscription key and save.
-
-### Add a route to `app.py`
-
-Let's create a route in your Flask app that calls `synthesize.py`. This route will be called each time a user presses the convert text-to-speech button in your app. Like the routes for translation and sentiment analysis, this route is going to accept `POST` requests since the function expects two arguments: the text to synthesize, and the voice font for playback.
-
-1. Open `app.py` and locate the import statement at the top of `app.py` and update it:
-
- ```python
- import translate, sentiment, synthesize
- ```
- Now our Flask app can use the method available via `synthesize.py`.
-
-2. Copy this code to the end of `app.py` and save:
-
- ```Python
- @app.route('/text-to-speech', methods=['POST'])
- def text_to_speech():
- data = request.get_json()
- text_input = data['text']
- voice_font = data['voice']
- tts = synthesize.TextToSpeech(text_input, voice_font)
- tts.get_token()
- audio_response = tts.save_audio()
- return audio_response
- ```
-
-### Update `index.html`
-
-Now that you have a function to convert text-to-speech, and a route in your Flask app to call it, the next step is to start writing the HTML for your app. The HTML below does a few things:
-
-* Provides a voice selection drop-down
-* Adds a button to convert text-to-speech
-* Adds an audio element, which is used to play back the synthesized speech
-
-1. Open `index.html` and locate these code comments:
- ```html
- <!-- Start voice font selection code -->
-
- <!-- End voice font selection code -->
- ```
-
-2. Replace the code comments with this HTML block:
- ```html
- <div class="form-group">
- <label for="select-voice"><strong>Select voice font:</strong></label>
- <select class="form-control" id="select-voice">
- <option value="(ar-SA, Naayf)">Arabic | Male | Naayf</option>
- <option value="(ca-ES, HerenaRUS)">Catalan | Female | HerenaRUS</option>
- <option value="(zh-CN, HuihuiRUS)">Chinese (Mainland) | Female | HuihuiRUS</option>
- <option value="(zh-CN, Kangkang, Apollo)">Chinese (Mainland) | Male | Kangkang, Apollo</option>
- <option value="(zh-HK, Tracy, Apollo)">Chinese (Hong Kong)| Female | Tracy, Apollo</option>
- <option value="(zh-HK, Danny, Apollo)">Chinese (Hong Kong) | Male | Danny, Apollo</option>
- <option value="(zh-TW, Yating, Apollo)">Chinese (Taiwan)| Female | Yating, Apollo</option>
- <option value="(zh-TW, Zhiwei, Apollo)">Chinese (Taiwan) | Male | Zhiwei, Apollo</option>
- <option value="(hr-HR, Matej)">Croatian | Male | Matej</option>
- <option value="(en-US, AriaRUS)">English (US) | Female | AriaRUS</option>
- <option value="(en-US, Guy24kRUS)">English (US) | Male | Guy24kRUS</option>
- <option value="(en-IE, Sean)">English (IE) | Male | Sean</option>
- <option value="(fr-FR, Julie, Apollo)">French | Female | Julie, Apollo</option>
-    <option value="(fr-FR, HortenseRUS)">French | Female | HortenseRUS</option>
- <option value="(fr-FR, Paul, Apollo)">French | Male | Paul, Apollo</option>
- <option value="(de-DE, Hedda)">German | Female | Hedda</option>
- <option value="(de-DE, HeddaRUS)">German | Female | HeddaRUS</option>
-    <option value="(de-DE, Stefan, Apollo)">German | Male | Stefan, Apollo</option>
- <option value="(el-GR, Stefanos)">Greek | Male | Stefanos</option>
-    <option value="(he-IL, Asaf)">Hebrew (Israel) | Male | Asaf</option>
- <option value="(hi-IN, Kalpana, Apollo)">Hindi | Female | Kalpana, Apollo</option>
- <option value="(hi-IN, Hemant)">Hindi | Male | Hemant</option>
- <option value="(it-IT, LuciaRUS)">Italian | Female | LuciaRUS</option>
- <option value="(it-IT, Cosimo, Apollo)">Italian | Male | Cosimo, Apollo</option>
- <option value="(ja-JP, Ichiro, Apollo)">Japanese | Male | Ichiro</option>
- <option value="(ja-JP, HarukaRUS)">Japanese | Female | HarukaRUS</option>
- <option value="(ko-KR, HeamiRUS)">Korean | Female | Heami</option>
- <option value="(pt-BR, HeloisaRUS)">Portuguese (Brazil) | Female | HeloisaRUS</option>
- <option value="(pt-BR, Daniel, Apollo)">Portuguese (Brazil) | Male | Daniel, Apollo</option>
- <option value="(pt-PT, HeliaRUS)">Portuguese (Portugal) | Female | HeliaRUS</option>
- <option value="(ru-RU, Irina, Apollo)">Russian | Female | Irina, Apollo</option>
- <option value="(ru-RU, Pavel, Apollo)">Russian | Male | Pavel, Apollo</option>
- <option value="(ru-RU, EkaterinaRUS)">Russian | Female | EkaterinaRUS</option>
- <option value="(es-ES, Laura, Apollo)">Spanish | Female | Laura, Apollo</option>
- <option value="(es-ES, HelenaRUS)">Spanish | Female | HelenaRUS</option>
- <option value="(es-ES, Pablo, Apollo)">Spanish | Male | Pablo, Apollo</option>
- <option value="(th-TH, Pattara)">Thai | Male | Pattara</option>
- <option value="(tr-TR, SedaRUS)">Turkish | Female | SedaRUS</option>
- <option value="(vi-VN, An)">Vietnamese | Male | An</option>
- </select>
- </div>
- ```
-
-3. Next, locate these code comments:
- ```html
- <!-- Add Speech Synthesis button and audio element -->
-
- <!-- End Speech Synthesis button -->
- ```
-
-4. Replace the code comments with this HTML block:
-
-```html
-<button type="submit" class="btn btn-primary mb-2" id="text-to-speech">Convert text-to-speech</button>
-<div id="audio-playback">
- <audio id="audio" controls>
- <source id="audio-source" type="audio/mpeg" />
- </audio>
-</div>
-```
-
-5. Make sure to save your work.
-
-### Update `main.js`
-
-In the code below, content from the HTML is used to construct a request to your Flask route. Specifically, the translation and the voice font are assigned to variables, and then passed along in the request to the `text-to-speech` route.
-
-The code then converts the audio response to a Blob, creates an object URL, and plays the audio back using the HTML audio element.
-
-1. From your IDE, open `main.js` in the `static/scripts` directory.
-2. Add this code below the sentiment analysis code in `static/scripts/main.js`:
- ```javascript
- // Convert text-to-speech
- $("#text-to-speech").on("click", function(e) {
- e.preventDefault();
- var ttsInput = document.getElementById("translation-result").value;
- var ttsVoice = document.getElementById("select-voice").value;
- var ttsRequest = { 'text': ttsInput, 'voice': ttsVoice }
-
- var xhr = new XMLHttpRequest();
- xhr.open("post", "/text-to-speech", true);
- xhr.setRequestHeader("Content-Type", "application/json");
- xhr.responseType = "blob";
- xhr.onload = function(evt){
- if (xhr.status === 200) {
- audioBlob = new Blob([xhr.response], {type: "audio/mpeg"});
- audioURL = URL.createObjectURL(audioBlob);
- if (audioURL.length > 5){
- var audio = document.getElementById("audio");
- var source = document.getElementById("audio-source");
- source.src = audioURL;
- audio.load();
- audio.play();
- }else{
- console.log("An error occurred getting and playing the audio.")
- }
- }
- }
- xhr.send(JSON.stringify(ttsRequest));
- });
- // Code for automatic language selection goes here.
- ```
-3. You're almost done. The last thing you're going to do is add some code to `main.js` to automatically select a voice font based on the language selected for translation. Add this code block to `main.js`:
- ```javascript
- // Automatic voice font selection based on translation output.
- $('select[id="select-language"]').change(function(e) {
- if ($(this).val() == "ar"){
- document.getElementById("select-voice").value = "(ar-SA, Naayf)";
- }
- if ($(this).val() == "ca"){
- document.getElementById("select-voice").value = "(ca-ES, HerenaRUS)";
- }
- if ($(this).val() == "zh-Hans"){
- document.getElementById("select-voice").value = "(zh-HK, Tracy, Apollo)";
- }
- if ($(this).val() == "zh-Hant"){
- document.getElementById("select-voice").value = "(zh-HK, Tracy, Apollo)";
- }
- if ($(this).val() == "hr"){
- document.getElementById("select-voice").value = "(hr-HR, Matej)";
- }
- if ($(this).val() == "en"){
-        document.getElementById("select-voice").value = "(en-US, AriaRUS)";
- }
- if ($(this).val() == "fr"){
- document.getElementById("select-voice").value = "(fr-FR, HortenseRUS)";
- }
- if ($(this).val() == "de"){
- document.getElementById("select-voice").value = "(de-DE, HeddaRUS)";
- }
- if ($(this).val() == "el"){
- document.getElementById("select-voice").value = "(el-GR, Stefanos)";
- }
- if ($(this).val() == "he"){
- document.getElementById("select-voice").value = "(he-IL, Asaf)";
- }
- if ($(this).val() == "hi"){
- document.getElementById("select-voice").value = "(hi-IN, Kalpana, Apollo)";
- }
- if ($(this).val() == "it"){
- document.getElementById("select-voice").value = "(it-IT, LuciaRUS)";
- }
- if ($(this).val() == "ja"){
- document.getElementById("select-voice").value = "(ja-JP, HarukaRUS)";
- }
- if ($(this).val() == "ko"){
- document.getElementById("select-voice").value = "(ko-KR, HeamiRUS)";
- }
- if ($(this).val() == "pt"){
- document.getElementById("select-voice").value = "(pt-BR, HeloisaRUS)";
- }
- if ($(this).val() == "ru"){
- document.getElementById("select-voice").value = "(ru-RU, EkaterinaRUS)";
- }
- if ($(this).val() == "es"){
- document.getElementById("select-voice").value = "(es-ES, HelenaRUS)";
- }
- if ($(this).val() == "th"){
- document.getElementById("select-voice").value = "(th-TH, Pattara)";
- }
- if ($(this).val() == "tr"){
- document.getElementById("select-voice").value = "(tr-TR, SedaRUS)";
- }
- if ($(this).val() == "vi"){
- document.getElementById("select-voice").value = "(vi-VN, An)";
- }
- });
- ```
-
-### Test your app
-
-Let's test speech synthesis in the app.
-
-```bash
-flask run
-```
-
-Navigate to the provided server address. Type text into the input area, select a language, and select **Translate**. You should get a translation. Next, select a voice, then select the **Convert text-to-speech** button. The translation should be played back as synthesized speech. If it doesn't work, make sure that you've added your subscription key.
-
-> [!TIP]
-> If the changes you've made aren't showing up, or the app doesn't work the way you expect it to, try clearing your cache or opening a private/incognito window.
-
-That's it, you have a working app that performs translations, analyzes sentiment, and synthesizes speech. Press **CTRL + C** to stop the app. Be sure to check out the other [Azure Cognitive Services](../index.yml).
-
-## Get the source code
-
-The source code for this project is available on [GitHub](https://github.com/MicrosoftTranslator/Text-Translation-API-V3-Flask-App-Tutorial).
-
-## Next steps
-
-* [Translator reference](./reference/v3-0-reference.md)
-* [Language service API reference](https://westus2.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1)
-* [Text-to-speech API reference](../speech-service/rest-text-to-speech.md)
cognitive-services Tutorial Wpf Translation Csharp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Translator/tutorial-wpf-translation-csharp.md
- Title: "Tutorial: Create a translation app with WPF, C# - Translator"
-description: In this tutorial, you'll create a WPF app to perform text translation, language detection, and spell checking with a single subscription key.
- Previously updated: 05/26/2020
-
-# Tutorial: Create a translation app with WPF
-
-In this tutorial, you'll build a [Windows Presentation Foundation (WPF)](/visualstudio/designers/getting-started-with-wpf) app that uses Azure Cognitive Services for text translation, language detection, and spell checking with a single subscription key. Specifically, your app will call APIs from the Translator and [Bing Spell Check](https://azure.microsoft.com/services/cognitive-services/spell-check/).
-
-What is WPF? It's a UI framework for building desktop client apps. The WPF development platform supports a broad set of app development features, including an app model, resources, controls, graphics, layout, data binding, documents, and security. It's part of the .NET Framework, so if you have previously built apps with the .NET Framework using ASP.NET or Windows Forms, the programming experience should be familiar. WPF uses Extensible Application Markup Language (XAML) to provide a declarative model for app programming, which we'll review in the coming sections.
-
-In this tutorial, you'll learn how to:
-
-> [!div class="checklist"]
-> * Create a WPF project in Visual Studio
-> * Add assemblies and NuGet packages to your project
-> * Create your app's UI with XAML
-> * Use the Translator to get languages, translate text, and detect the source language
-> * Use the Bing Spell Check API to validate your input and improve translation accuracy
-> * Run your WPF app
-
-### Cognitive Services used in this tutorial
-
-This list includes the Cognitive Services used in this tutorial. Follow the link to browse the API reference for each feature.
-
-| Service | Feature | Description |
-|---------|---------|-------------|
-| Translator | [Get Languages](./reference/v3-0-languages.md) | Retrieve a complete list of supported languages for text translation. |
-| Translator | [Translate](./reference/v3-0-translate.md) | Translate text. |
-| Translator | [Detect](./reference/v3-0-detect.md) | Detect the language of the input text. Includes confidence score for detection. |
-| Bing Spell Check | [Spell Check](/rest/api/cognitiveservices/bing-spell-check-api-v7-reference) | Correct spelling errors to improve translation accuracy. |
-
-## Prerequisites
-
-Before we continue, you'll need the following:
-
-* An Azure Cognitive Services subscription. [Get a Cognitive Services key](../cognitive-services-apis-create-account.md#create-a-new-azure-cognitive-services-resource).
-* A Windows machine
-* [Visual Studio 2019](https://www.visualstudio.com/downloads/) - Community or Enterprise
-
-> [!NOTE]
-> We recommend creating the subscription in the West US region for this tutorial. Otherwise, you'll need to change endpoints and regions in the code as you work through this exercise.
-
-## Create a WPF app in Visual Studio
-
-The first thing we need to do is set up our project in Visual Studio.
-
-1. Open Visual Studio. Select **Create a new project**.
-1. In **Create a new project**, locate and select **WPF App (.NET Framework)**. You can select C# from **Language** to narrow the options.
-1. Select **Next**, and then name your project `MSTranslatorDemo`.
-1. Set the framework version to **.NET Framework 4.7.2** or later, and select **Create**.
- ![Enter the name and framework version in Visual Studio](media/name-wpf-project-visual-studio.png)
-
-Your project has been created. You'll notice that there are two tabs open: `MainWindow.xaml` and `MainWindow.xaml.cs`. Throughout this tutorial, we'll be adding code to these two files. We'll modify `MainWindow.xaml` for the app's user interface. We'll modify `MainWindow.xaml.cs` for our calls to Translator and Bing Spell Check.
- ![Review your environment](media/blank-wpf-project.png)
-
-In the next section, we're going to add assemblies and a NuGet package to our project for additional functionality, like JSON parsing.
-
-## Add references and NuGet packages to your project
-
-Our project requires a handful of .NET Framework assemblies and Newtonsoft.Json, which we'll install using the NuGet Package Manager.
-
-### Add .NET Framework assemblies
-
-Let's add assemblies to our project to serialize and deserialize objects, and to manage HTTP requests and responses.
-
-1. Locate your project in Visual Studio's Solution Explorer. Right-click your project, then select **Add > Reference**, which opens **Reference Manager**.
-1. The **Assemblies** tab lists all .NET Framework assemblies that are available to reference. Use the search bar in the upper right to search for references.
- ![Add assembly references](media/add-assemblies-2019.png)
-1. Select the following references for your project:
- * [System.Runtime.Serialization](/dotnet/api/system.runtime.serialization)
- * [System.Web](/dotnet/api/system.web)
- * System.Web.Extensions
- * [System.Windows](/dotnet/api/system.windows)
-1. After you've added these references to your project, you can click **OK** to close **Reference Manager**.
-
-> [!NOTE]
-> If you'd like to learn more about assembly references, see [How to: Add or remove reference using the Reference Manager](/visualstudio/ide/how-to-add-or-remove-references-by-using-the-reference-manager).
-
-### Install Newtonsoft.Json
-
-Our app will use Newtonsoft.Json to deserialize JSON objects. Follow these instructions to install the package.
-
-1. Locate your project in Visual Studio's Solution Explorer and right-click on your project. Select **Manage NuGet Packages**.
-1. Locate and select the **Browse** tab.
-1. Enter [Newtonsoft.Json](https://www.nuget.org/packages/Newtonsoft.Json/) into the search bar.
-
- ![Locate and install NewtonSoft.Json](media/nuget-package-manager.png)
-
-1. Select the package and click **Install**.
-1. When the installation is complete, close the tab.
-
-## Create a WPF form using XAML
-
-To use your app, you're going to need a user interface. Using XAML, we'll create a form that allows users to select the input and translation languages, enter the text to translate, and view the translation output.
-
-Let's take a look at what we're building.
-
-![WPF XAML user interface](media/translator-text-csharp-xaml.png)
-
-The user interface includes these components:
-
-| Name | Type | Description |
-|---------|---------|-------------|
-| `FromLanguageComboBox` | ComboBox | Displays a list of the languages supported by Microsoft Translator for text translation. The user selects the language they are translating from. |
-| `ToLanguageComboBox` | ComboBox | Displays the same list of languages as `FromLanguageComboBox`, but is used to select the language the user is translating to. |
-| `TextToTranslate` | TextBox | Allows the user to enter text to be translated. |
-| `TranslateButton` | Button | Use this button to translate text. |
-| `TranslatedTextLabel` | Label | Displays the translation. |
-| `DetectedLanguageLabel` | Label | Displays the detected language of the text to be translated (`TextToTranslate`). |
-
-> [!NOTE]
-> We're creating this form using XAML source code; however, you can also create the form with the designer in Visual Studio.
-
-Let's add the code to our project.
-
-1. In Visual Studio, select the tab for `MainWindow.xaml`.
-1. Copy this code into your project, and then select **File > Save MainWindow.xaml** to save your changes.
- ```xaml
- <Window x:Class="MSTranslatorDemo.MainWindow"
- xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
- xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
- xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
- xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
- xmlns:local="clr-namespace:MSTranslatorDemo"
- mc:Ignorable="d"
- Title="Microsoft Translator" Height="400" Width="700" BorderThickness="0">
- <Grid>
- <Label x:Name="label" Content="Microsoft Translator" HorizontalAlignment="Left" Margin="39,6,0,0" VerticalAlignment="Top" Height="49" FontSize="26.667"/>
- <TextBox x:Name="TextToTranslate" HorizontalAlignment="Left" Height="23" Margin="42,160,0,0" TextWrapping="Wrap" VerticalAlignment="Top" Width="600" FontSize="14" TabIndex="3"/>
- <Label x:Name="EnterTextLabel" Content="Text to translate:" HorizontalAlignment="Left" Margin="40,129,0,0" VerticalAlignment="Top" FontSize="14"/>
- <Label x:Name="toLabel" Content="Translate to:" HorizontalAlignment="Left" Margin="304,58,0,0" VerticalAlignment="Top" FontSize="14"/>
-
- <Button x:Name="TranslateButton" Content="Translate" HorizontalAlignment="Left" Margin="39,206,0,0" VerticalAlignment="Top" Width="114" Height="31" Click="TranslateButton_Click" FontSize="14" TabIndex="4" IsDefault="True"/>
- <ComboBox x:Name="ToLanguageComboBox"
- HorizontalAlignment="Left"
- Margin="306,88,0,0"
- VerticalAlignment="Top"
- Width="175" FontSize="14" TabIndex="2">
-
- </ComboBox>
- <Label x:Name="fromLabel" Content="Translate from:" HorizontalAlignment="Left" Margin="40,58,0,0" VerticalAlignment="Top" FontSize="14"/>
- <ComboBox x:Name="FromLanguageComboBox"
- HorizontalAlignment="Left"
- Margin="42,88,0,0"
- VerticalAlignment="Top"
- Width="175" FontSize="14" TabIndex="1"/>
- <Label x:Name="TranslatedTextLabel" Content="Translation is displayed here." HorizontalAlignment="Left" Margin="39,255,0,0" VerticalAlignment="Top" Width="620" FontSize="14" Height="85" BorderThickness="0"/>
- <Label x:Name="DetectedLanguageLabel" Content="Autodetected language is displayed here." HorizontalAlignment="Left" Margin="39,288,0,0" VerticalAlignment="Top" Width="620" FontSize="14" Height="84" BorderThickness="0"/>
- </Grid>
- </Window>
- ```
-You should now see a preview of the app's user interface in Visual Studio. It should look similar to the image above.
-
-That's it, your form is ready. Now let's write some code to use Text Translation and Bing Spell Check.
-
-> [!NOTE]
-> Feel free to tweak this form or create your own.
-
-## Create your app
-
-`MainWindow.xaml.cs` contains the code that controls our app. In the next few sections, we're going to add code to populate our drop-down menus, and to call a handful of APIs exposed by Translator and Bing Spell Check.
-
-* When the program starts and `MainWindow` is instantiated, the `Languages` method of the Translator is called to retrieve and populate our language selection drop-downs. This happens once at the beginning of each session.
-* When the **Translate** button is clicked, the user's language selection and text are retrieved, spell check is performed on the input, and the translation and detected language are displayed for the user.
- * The `Translate` method of the Translator is called to translate text from `TextToTranslate`. This call also includes the `to` and `from` languages selected using the drop-down menus.
- * The `Detect` method of the Translator is called to determine the text language of `TextToTranslate`.
- * Bing Spell Check is used to validate `TextToTranslate` and adjust misspellings.
-
-All of our project's logic is encapsulated in the `MainWindow : Window` class. Let's start by adding code to set your subscription key, declare endpoints for Translator and Bing Spell Check, and initialize the app.
-
-1. In Visual Studio, select the tab for `MainWindow.xaml.cs`.
-1. Replace the pre-populated `using` statements with the following.
- ```csharp
- using System;
- using System.Windows;
- using System.Net;
- using System.Net.Http;
- using System.IO;
- using System.Collections.Generic;
- using System.Linq;
- using System.Text;
- using Newtonsoft.Json;
- ```
-1. Locate the `MainWindow : Window` class, and replace it with this code:
- ```csharp
- public partial class MainWindow : Window
- {
- // This sample uses the Cognitive Services subscription key for all services. To learn more about
- // authentication options, see: https://docs.microsoft.com/azure/cognitive-services/authentication.
- const string COGNITIVE_SERVICES_KEY = "YOUR_COG_SERVICES_KEY";
- // Endpoints for Translator and Bing Spell Check
- public static readonly string TEXT_TRANSLATION_API_ENDPOINT = "https://api.cognitive.microsofttranslator.com/{0}?api-version=3.0";
- const string BING_SPELL_CHECK_API_ENDPOINT = "https://westus.api.cognitive.microsoft.com/bing/v7.0/spellcheck/";
- // An array of language codes
- private string[] languageCodes;
-
- // Dictionary mapping friendly language names to language codes (sorted case-insensitively by name)
- private SortedDictionary<string, string> languageCodesAndTitles =
- new SortedDictionary<string, string>(Comparer<string>.Create((a, b) => string.Compare(a, b, true)));
-
- // Global exception handler to display error message and exit
- private static void HandleExceptions(object sender, UnhandledExceptionEventArgs args)
- {
- Exception e = (Exception)args.ExceptionObject;
- MessageBox.Show("Caught " + e.Message, "Error", MessageBoxButton.OK, MessageBoxImage.Error);
- System.Windows.Application.Current.Shutdown();
- }
- // MainWindow constructor
- public MainWindow()
- {
- // Display a message if unexpected error is encountered
- AppDomain.CurrentDomain.UnhandledException += new UnhandledExceptionEventHandler(HandleExceptions);
-
- if (COGNITIVE_SERVICES_KEY.Length != 32)
- {
- MessageBox.Show("One or more invalid API subscription keys.\n\n" +
- "Put your keys in the *_API_SUBSCRIPTION_KEY variables in MainWindow.xaml.cs.",
- "Invalid Subscription Key(s)", MessageBoxButton.OK, MessageBoxImage.Error);
- System.Windows.Application.Current.Shutdown();
- }
- else
- {
- // Start GUI
- InitializeComponent();
- // Get languages for drop-downs
- GetLanguagesForTranslate();
- // Populate drop-downs with values from GetLanguagesForTranslate
- PopulateLanguageMenus();
- }
- }
- // NOTE:
- // In the following sections, we'll add code below this.
- }
- ```
-1. Add your Cognitive Services subscription key and save.
-
-In this code block, we've declared two member variables that contain information about available languages for translation:
-
-| Variable | Type | Description |
-|----------|------|-------------|
-|`languageCodes` | Array of strings |Caches the language codes. The Translator service uses short codes, such as `en` for English, to identify languages. |
-|`languageCodesAndTitles` | Sorted dictionary | Maps the "friendly" names in the user interface back to the short codes used in the API. Kept sorted alphabetically without regard for case. |
-
-Then, within the `MainWindow` constructor, we've added error handling with `HandleExceptions`. This error handling ensures that an alert is provided if an exception isn't handled. Then a check is run to confirm that the subscription key provided is exactly 32 characters long. An error is shown, and the app shuts down, if the key is shorter or longer than 32 characters.
-
-If the key is the right length, the `InitializeComponent()` call gets the user interface rolling by locating, loading, and instantiating the XAML description of the main app window.
-
-Last, we've added code to call methods to retrieve languages for translation and to populate the drop-down menus for our app's user interface. Don't worry, we'll get to the code behind these calls soon.
-
-## Get supported languages
-
-We recommend calling the Languages resource exposed by the Translator rather than hardcoding the language list in your app.
-
-In this section, we'll create a `GET` request to the Languages resource, specifying that we want a list of languages available for translation.
-
-> [!NOTE]
-> The Languages resource allows you to filter language support with the following query parameters: transliteration, dictionary, and translation. For more information, see [API reference](./reference/v3-0-languages.md).
-
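-As a quick illustration, the request this tutorial's code builds (the `TEXT_TRANSLATION_API_ENDPOINT` constant formatted with `languages`, plus `scope=translation`) amounts to:
-
-```
-GET https://api.cognitive.microsofttranslator.com/languages?api-version=3.0&scope=translation
-```
-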
-Before we go any further, let's take a look at a sample output for a call to the Languages resource:
-
-```json
-{
- "translation": {
- "af": {
- "name": "Afrikaans",
- "nativeName": "Afrikaans",
- "dir": "ltr"
- },
- "ar": {
- "name": "Arabic",
- "nativeName": "العربية",
- "dir": "rtl"
- }
-    // Additional languages are provided in the full JSON output.
-  }
-}
-```
-
-From this output, we can extract the language code and the `name` of a specific language. Our app uses Newtonsoft.Json to deserialize the JSON object ([`JsonConvert.DeserializeObject`](https://www.newtonsoft.com/json/help/html/M_Newtonsoft_Json_JsonConvert_DeserializeObject__1.htm)).
-
-Picking up where we left off in the last section, let's add a method to get supported languages to our app.
-
-1. In Visual Studio, open the tab for `MainWindow.xaml.cs`.
-2. Add this code to your project:
- ```csharp
- // ***** GET TRANSLATABLE LANGUAGE CODES
- private void GetLanguagesForTranslate()
- {
- // Send request to get supported language codes
- string uri = String.Format(TEXT_TRANSLATION_API_ENDPOINT, "languages") + "&scope=translation";
- WebRequest request = WebRequest.Create(uri);
- request.Headers.Add("Accept-Language", "en");
- // Read and parse the JSON response
- WebResponse response = request.GetResponse();
- using (var reader = new StreamReader(response.GetResponseStream(), Encoding.UTF8))
- {
- var result = JsonConvert.DeserializeObject<Dictionary<string, Dictionary<string, Dictionary<string, string>>>>(reader.ReadToEnd());
- var languages = result["translation"];
-
- languageCodes = languages.Keys.ToArray();
- foreach (var kv in languages)
- {
- languageCodesAndTitles.Add(kv.Value["name"], kv.Key);
- }
- }
- }
- // NOTE:
- // In the following sections, we'll add code below this.
- ```
-
-The `GetLanguagesForTranslate()` method creates an HTTP GET request, and uses the `scope=translation` query string parameter to limit the request to languages supported for translation. The `Accept-Language` header with the value `en` is added so that the supported languages are returned in English.
-
-The JSON response is parsed and converted to a dictionary. Then the language codes are added to the `languageCodes` member variable. The key/value pairs that contain the language codes and the friendly language names are looped through and added to the `languageCodesAndTitles` member variable. The drop-down menus in the form display the friendly names, but the codes are needed to request the translation.
-
-## Populate language drop-down menus
-
-The user interface is defined using XAML, so you don't need to do much to set it up besides call `InitializeComponent()`. The one thing you need to do is add the friendly language names to the **Translate from** and **Translate to** drop-down menus. The `PopulateLanguageMenus()` method adds the names.
-
-1. In Visual Studio, open the tab for `MainWindow.xaml.cs`.
-2. Add this code to your project below the `GetLanguagesForTranslate()` method:
- ```csharp
- private void PopulateLanguageMenus()
- {
- // Add option to automatically detect the source language
- FromLanguageComboBox.Items.Add("Detect");
-
- int count = languageCodesAndTitles.Count;
- foreach (string menuItem in languageCodesAndTitles.Keys)
- {
- FromLanguageComboBox.Items.Add(menuItem);
- ToLanguageComboBox.Items.Add(menuItem);
- }
-
- // Set default languages
- FromLanguageComboBox.SelectedItem = "Detect";
- ToLanguageComboBox.SelectedItem = "English";
- }
- // NOTE:
- // In the following sections, we'll add code below this.
- ```
-
-This method iterates over the `languageCodesAndTitles` dictionary and adds each key to both menus. After the menus are populated, the default from and to languages are set to **Detect** and **English** respectively.
-
-> [!TIP]
-> Without a default selection for the menus, the user can click **Translate** without first choosing a "to" or "from" language. The defaults eliminate the need to deal with this problem.
-
-At this point, `MainWindow` has been initialized and the user interface created. The remaining code won't run until the **Translate** button is clicked.
-
-## Detect language of source text
-
-Now we're going to create a method that detects the language of the source text (the text entered into our text area) using the Translator. The value returned by this request will be used in our translation request later.
-
-1. In Visual Studio, open the tab for `MainWindow.xaml.cs`.
-2. Add this code to your project below the `PopulateLanguageMenus()` method:
- ```csharp
- // ***** DETECT LANGUAGE OF TEXT TO BE TRANSLATED
- private string DetectLanguage(string text)
- {
- string detectUri = string.Format(TEXT_TRANSLATION_API_ENDPOINT, "detect");
-
- // Create request to Detect languages with Translator
- HttpWebRequest detectLanguageWebRequest = (HttpWebRequest)WebRequest.Create(detectUri);
- detectLanguageWebRequest.Headers.Add("Ocp-Apim-Subscription-Key", COGNITIVE_SERVICES_KEY);
- detectLanguageWebRequest.Headers.Add("Ocp-Apim-Subscription-Region", "westus");
- detectLanguageWebRequest.ContentType = "application/json; charset=utf-8";
- detectLanguageWebRequest.Method = "POST";
-
- // Send request
- var serializer = new System.Web.Script.Serialization.JavaScriptSerializer();
- string jsonText = serializer.Serialize(text);
-
- string body = "[{ \"Text\": " + jsonText + " }]";
- byte[] data = Encoding.UTF8.GetBytes(body);
-
- detectLanguageWebRequest.ContentLength = data.Length;
-
- using (var requestStream = detectLanguageWebRequest.GetRequestStream())
- requestStream.Write(data, 0, data.Length);
-
- HttpWebResponse response = (HttpWebResponse)detectLanguageWebRequest.GetResponse();
-
- // Read and parse JSON response
- var responseStream = response.GetResponseStream();
- var jsonString = new StreamReader(responseStream, Encoding.GetEncoding("utf-8")).ReadToEnd();
- dynamic jsonResponse = serializer.DeserializeObject(jsonString);
-
- // Fish out the detected language code
- var languageInfo = jsonResponse[0];
- if (languageInfo["score"] > (decimal)0.5)
- {
- DetectedLanguageLabel.Content = languageInfo["language"];
- return languageInfo["language"];
- }
- else
- return "Unable to confidently detect input language.";
- }
- // NOTE:
- // In the following sections, we'll add code below this.
- ```
-
-This method creates an HTTP `POST` request to the Detect resource. It takes a single argument, `text`, which is passed along as the body of the request. Later, when we create our translation request, the text entered into our UI will be passed to this method for language detection.
-
-Additionally, this method evaluates the confidence score of the response. If the score is greater than `0.5`, then the detected language is displayed in our user interface.
-
-## Spell check the source text
-
-Now we're going to create a method to spell check our source text using the Bing Spell Check API. Spell checking ensures that we'll get back accurate translations from the Translator. Any corrections to the source text are passed along in our translation request when the **Translate** button is clicked.
-
-1. In Visual Studio, open the tab for `MainWindow.xaml.cs`.
-2. Add this code to your project below the `DetectLanguage()` method:
-
-```csharp
-// ***** CORRECT SPELLING OF TEXT TO BE TRANSLATED
-private string CorrectSpelling(string text)
-{
- string uri = BING_SPELL_CHECK_API_ENDPOINT + "?mode=spell&mkt=en-US";
-
- // Create a request to Bing Spell Check API
- HttpWebRequest spellCheckWebRequest = (HttpWebRequest)WebRequest.Create(uri);
- spellCheckWebRequest.Headers.Add("Ocp-Apim-Subscription-Key", COGNITIVE_SERVICES_KEY);
- spellCheckWebRequest.Method = "POST";
-spellCheckWebRequest.ContentType = "application/x-www-form-urlencoded"; // required; the request fails without this content type
-
- // Create and send the request
- string body = "text=" + System.Web.HttpUtility.UrlEncode(text);
- byte[] data = Encoding.UTF8.GetBytes(body);
- spellCheckWebRequest.ContentLength = data.Length;
- using (var requestStream = spellCheckWebRequest.GetRequestStream())
- requestStream.Write(data, 0, data.Length);
- HttpWebResponse response = (HttpWebResponse)spellCheckWebRequest.GetResponse();
-
- // Read and parse the JSON response; get spelling corrections
- var serializer = new System.Web.Script.Serialization.JavaScriptSerializer();
- var responseStream = response.GetResponseStream();
- var jsonString = new StreamReader(responseStream, Encoding.GetEncoding("utf-8")).ReadToEnd();
- dynamic jsonResponse = serializer.DeserializeObject(jsonString);
- var flaggedTokens = jsonResponse["flaggedTokens"];
-
- // Construct sorted dictionary of corrections in reverse order (right to left)
- // This ensures that changes don't impact later indexes
- var corrections = new SortedDictionary<int, string[]>(Comparer<int>.Create((a, b) => b.CompareTo(a)));
- for (int i = 0; i < flaggedTokens.Length; i++)
- {
- var correction = flaggedTokens[i];
- var suggestion = correction["suggestions"][0]; // Consider only first suggestion
- if (suggestion["score"] > (decimal)0.7) // Take it only if highly confident
- corrections[(int)correction["offset"]] = new string[] // dict key = offset
- { correction["token"], suggestion["suggestion"] }; // dict value = {error, correction}
- }
-
- // Apply spelling corrections, in order, from right to left
- foreach (int i in corrections.Keys)
- {
- var oldtext = corrections[i][0];
- var newtext = corrections[i][1];
-
- // Apply capitalization from original text to correction - all caps or initial caps
- if (text.Substring(i, oldtext.Length).All(char.IsUpper)) newtext = newtext.ToUpper();
- else if (char.IsUpper(text[i])) newtext = newtext[0].ToString().ToUpper() + newtext.Substring(1);
-
- text = text.Substring(0, i) + newtext + text.Substring(i + oldtext.Length);
- }
- return text;
-}
-// NOTE:
-// In the following sections, we'll add code below this.
-```
-
-## Translate text on click
-
-The last thing that we need to do is create a method that is invoked when the **Translate** button in our user interface is clicked.
-
-1. In Visual Studio, open the tab for `MainWindow.xaml.cs`.
-1. Add this code to your project below the `CorrectSpelling()` method and save:
- ```csharp
- // ***** PERFORM TRANSLATION ON BUTTON CLICK
- private async void TranslateButton_Click(object sender, EventArgs e)
- {
- string textToTranslate = TextToTranslate.Text.Trim();
-
- string fromLanguage = FromLanguageComboBox.SelectedValue.ToString();
- string fromLanguageCode;
-
- // auto-detect source language if requested
- if (fromLanguage == "Detect")
- {
- fromLanguageCode = DetectLanguage(textToTranslate);
- if (!languageCodes.Contains(fromLanguageCode))
- {
- MessageBox.Show("The source language could not be detected automatically " +
- "or is not supported for translation.", "Language detection failed",
- MessageBoxButton.OK, MessageBoxImage.Error);
- return;
- }
- }
- else
- fromLanguageCode = languageCodesAndTitles[fromLanguage];
-
- string toLanguageCode = languageCodesAndTitles[ToLanguageComboBox.SelectedValue.ToString()];
-
- // spell-check the source text if the source language is English
- if (fromLanguageCode == "en")
- {
- if (textToTranslate.StartsWith("-")) // don't spell check in this case
- textToTranslate = textToTranslate.Substring(1);
- else
- {
- textToTranslate = CorrectSpelling(textToTranslate);
- TextToTranslate.Text = textToTranslate; // put corrected text into input field
- }
- }
- // handle null operations: no text or same source/target languages
- if (textToTranslate == "" || fromLanguageCode == toLanguageCode)
- {
- TranslatedTextLabel.Content = textToTranslate;
- return;
- }
-
- // send HTTP request to perform the translation
- string endpoint = string.Format(TEXT_TRANSLATION_API_ENDPOINT, "translate");
- string uri = string.Format(endpoint + "&from={0}&to={1}", fromLanguageCode, toLanguageCode);
-
- System.Object[] body = new System.Object[] { new { Text = textToTranslate } };
- var requestBody = JsonConvert.SerializeObject(body);
-
- using (var client = new HttpClient())
- using (var request = new HttpRequestMessage())
- {
- request.Method = HttpMethod.Post;
- request.RequestUri = new Uri(uri);
- request.Content = new StringContent(requestBody, Encoding.UTF8, "application/json");
- request.Headers.Add("Ocp-Apim-Subscription-Key", COGNITIVE_SERVICES_KEY);
- request.Headers.Add("Ocp-Apim-Subscription-Region", "westus");
- request.Headers.Add("X-ClientTraceId", Guid.NewGuid().ToString());
-
- var response = await client.SendAsync(request);
- var responseBody = await response.Content.ReadAsStringAsync();
-
- var result = JsonConvert.DeserializeObject<List<Dictionary<string, List<Dictionary<string, string>>>>>(responseBody);
- var translation = result[0]["translations"][0]["text"];
-
- // Update the translation field
- TranslatedTextLabel.Content = translation;
- }
- }
- ```
-
-The first step is to get the "from" and "to" languages, and the text the user entered into our form. If the source language is set to **Detect**, `DetectLanguage()` is called to determine the language of the source text. The text might be in a language that the Translator doesn't support; in that case, the app displays a message to inform the user, and returns without translating the text.
-
-If the source language is English (whether specified or detected), the spelling of the text is checked with `CorrectSpelling()` and any corrections are applied. The corrected text is added back into the text area so that the user sees that a correction was made.
-
-The code to translate text should look familiar: build the URI, create a request, send it, and parse the response. The JSON array may contain more than one object for translation; however, our app only requires one.
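-
-For reference, a successful response resembles the following abbreviated JSON, which matches the shape the parsing code above reads (`result[0]["translations"][0]["text"]`):
-
-```json
-[
-    {
-        "translations": [
-            { "text": "Hallo Welt", "to": "de" }
-        ]
-    }
-]
-```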
-
-After a successful request, `TranslatedTextLabel.Content` is replaced with the `translation`, which updates the user interface to display the translated text.
-
-## Run your WPF app
-
-That's it, you have a working translation app built using WPF. To run your app, click the **Start** button in Visual Studio.
-
-## Source code
-
-Source code for this project is available on GitHub.
-
-* [Explore source code](https://github.com/MicrosoftTranslator/Text-Translation-API-V3-C-Sharp-Tutorial)
-
-## Next steps
-
-> [!div class="nextstepaction"]
-> [Microsoft Translator reference](./reference/v3-0-reference.md)
cognitive-services Local Categories https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-local-business-search/local-categories.md
# Search categories for the Bing Local Business Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Local Business Search API enables you to search for local business entities in a variety of categories, with priority given to results close to a user's location. You can include these categories in searches along with the `localCircularView` and `localMapView` [parameters](specify-geographic-search.md).
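
As a sketch, a categorized query bounded to a circular area might look like the following; the `localCircularView` value is a latitude, longitude, and radius (see the linked parameter reference for exact semantics):

```
GET https://api.cognitive.microsoft.com/bing/v7.0/localbusinesses/search?q=restaurants&localCircularView=47.6421,-122.13715,5000
```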
cognitive-services Local Search Query Response https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-local-business-search/local-search-query-response.md
# Sending and using Bing Local Business Search API queries and responses
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
You can get local results from the Bing Local Business Search API by sending a search query to its endpoint and including the `Ocp-Apim-Subscription-Key` header, which is required. Along with the available [headers](local-search-reference.md#headers) and [parameters](local-search-reference.md#query-parameters), searches can be customized by specifying [geographic boundaries](specify-geographic-search.md) for the area to be searched, and the [categories](local-search-query-response.md) of places returned.
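
As a minimal sketch (not from this article; the endpoint path and market value are assumptions based on the quickstarts in this section), a C# request with the required header might look like this:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

class LocalSearchSketch
{
    static async Task Main()
    {
        using (var client = new HttpClient())
        {
            // The Ocp-Apim-Subscription-Key header is required on every request.
            client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "YOUR-SUBSCRIPTION-KEY");

            string uri = "https://api.cognitive.microsoft.com/bing/v7.0/localbusinesses/search"
                + "?q=Italian%20restaurants&mkt=en-US";

            string json = await client.GetStringAsync(uri);
            Console.WriteLine(json); // raw JSON response with local business results
        }
    }
}
```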
cognitive-services Local Search Reference https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-local-business-search/local-search-reference.md
# Bing Local Business Search API v7 reference
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Local Business Search API sends a search query to Bing to get results that include restaurants, hotels, or other local businesses. For places, the query can specify the name of the local business or a category (for example, restaurants near me). Entity results include persons, places, or things. A place in this context can be a business entity, a state, a country/region, and so on.
cognitive-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-local-business-search/overview.md
# What is Bing Local Business Search?
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Local Business Search API is a RESTful service that enables your applications to find information about local businesses based on search queries. For example, `q=<business-name> in Redmond, Washington`, or `q=Italian restaurants near me`.

## Features
cognitive-services Local Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-local-business-search/quickstarts/local-quickstart.md
# Quickstart: Send a query to the Bing Local Business Search API in C#
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to learn how to send requests to the Bing Local Business Search API, which is an Azure Cognitive Service. Although this simple application is written in C#, the API is a RESTful Web service compatible with any programming language capable of making HTTP requests and parsing JSON.
cognitive-services Local Search Java Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-local-business-search/quickstarts/local-search-java-quickstart.md
# Quickstart: Send a query to the Bing Local Business Search API using Java
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to learn how to send requests to the Bing Local Business Search API, which is an Azure Cognitive Service. Although this simple application is written in Java, the API is a RESTful Web service compatible with any programming language capable of making HTTP requests and parsing JSON.
cognitive-services Local Search Node Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-local-business-search/quickstarts/local-search-node-quickstart.md
# Quickstart: Send a query to the Bing Local Business Search API using Node.js
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to learn how to send requests to the Bing Local Business Search API, which is an Azure Cognitive Service. Although this simple application is written in Node.js, the API is a RESTful Web service compatible with any programming language capable of making HTTP requests and parsing JSON.
cognitive-services Local Search Python Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-local-business-search/quickstarts/local-search-python-quickstart.md
# Quickstart: Send a query to the Bing Local Business Search API in Python
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to learn how to send requests to the Bing Local Business Search API, which is an Azure Cognitive Service. Although this simple application is written in Python, the API is a RESTful Web service compatible with any programming language capable of making HTTP requests and parsing JSON.
import json
# Replace the subscriptionKey string value with your valid subscription key.
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
subscriptionKey = 'YOUR-SUBSCRIPTION-KEY'
host = 'api.cognitive.microsoft.com'
cognitive-services Specify Geographic Search https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-local-business-search/specify-geographic-search.md
# Use geographic boundaries to filter results from the Bing Local Business Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Local Business Search API enables you to set boundaries on the specific geographic area you'd like to search by using the `localCircularView` or `localMapView` query parameters. Be sure to use only one parameter in your queries.
cognitive-services Bing Insights Usage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/bing-insights-usage.md
Last updated 04/03/2019
# Examples of Bing insights usage
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
This article contains examples of how Bing might use and display image insights on Bing.com.
cognitive-services Sending Queries https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/concepts/sending-queries.md
# Sending search queries to the Bing Visual Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
This article describes the parameters and attributes of requests sent to the Bing Visual Search API, as well as the response object.
cognitive-services Default Insights Tag https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/default-insights-tag.md
Last updated 04/04/2019
# Default insights tag
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The default insights tag is the one with the `displayName` field set to an empty string. The following example shows the possible list of default insights (actions). The list of actions the response includes depends on the image, and for each action, the list of properties may vary by image, so check whether a property exists before trying to use it.
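
As an illustrative sketch (not from this article; it assumes the response has been read into a string and uses Newtonsoft.Json, which the API itself doesn't require), locating the default tag and guarding against absent properties might look like this:

```csharp
using System;
using Newtonsoft.Json.Linq;

class DefaultTagSketch
{
    static void PrintDefaultInsights(string jsonString)
    {
        JObject response = JObject.Parse(jsonString);

        foreach (JObject tag in response["tags"] ?? new JArray())
        {
            // The default tag is the one whose displayName is an empty string.
            if ((string)tag["displayName"] != "") continue;

            foreach (JObject action in tag["actions"] ?? new JArray())
            {
                // Properties vary by image, so check for existence before use.
                JToken url = action["webSearchUrl"];
                if (url != null)
                    Console.WriteLine($"{(string)action["actionType"]}: {url}");
            }
        }
    }
}
```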
cognitive-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/language-support.md
Last updated 09/25/2018
# Language and region support for the Bing Visual Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Bing Visual Search API supports more than three dozen countries/regions, many with more than one language. Each request should include the user's country/region and language of choice. Knowing the user's market helps Bing return appropriate results. If you don't specify a country/region and language, Bing makes a best effort to determine the user's country/region and language. Because the results may contain links to Bing, knowing the country/region and language may provide a preferred localized Bing user experience if the user clicks the Bing links.
cognitive-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/overview.md
Last updated 12/19/2019
# What is the Bing Visual Search API?
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Visual Search API returns insights for an image. You can either upload an image or provide a URL to one. Insights are visually similar images, shopping sources, webpages that include the image, and more. Insights returned by the Bing Visual Search API are similar to ones shown on Bing.com/images.
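
As a minimal sketch of the upload path (not from this article; the endpoint and the `image` form-field name are assumptions based on the quickstarts below), a C# upload might look like this:

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

class VisualSearchUploadSketch
{
    static async Task Main()
    {
        const string endpoint = "https://api.cognitive.microsoft.com/bing/v7.0/images/visualsearch";

        using (var client = new HttpClient())
        using (var form = new MultipartFormDataContent())
        {
            client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "YOUR-SUBSCRIPTION-KEY");

            // The binary image goes in a form field named "image".
            form.Add(new ByteArrayContent(File.ReadAllBytes("my-image.jpg")), "image", "my-image.jpg");

            HttpResponseMessage response = await client.PostAsync(endpoint, form);
            Console.WriteLine(await response.Content.ReadAsStringAsync()); // insights JSON
        }
    }
}
```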
cognitive-services Client Libraries https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/quickstarts/client-libraries.md
# Quickstart: Use the Bing Visual Search client library
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
::: zone pivot="programming-language-csharp"
cognitive-services Csharp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/quickstarts/csharp.md
# Quickstart: Get image insights using the Bing Visual Search REST API and C#
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
This quickstart demonstrates how to upload an image to the Bing Visual Search API and view the insights that it returns.
cognitive-services Go https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/quickstarts/go.md
# Quickstart: Get image insights using the Bing Visual Search REST API and Go
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Visual Search API using the Go programming language. A POST request uploads an image to the API endpoint. The results include URLs and descriptive information about images similar to the uploaded image.
cognitive-services Java https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/quickstarts/java.md
# Quickstart: Get image insights using the Bing Visual Search REST API and Java
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Visual Search API. This Java application uploads an image to the API and displays the information it returns. Although this application is written in Java, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/quickstarts/nodejs.md
# Quickstart: Get image insights using the Bing Visual Search REST API and Node.js
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Visual Search API. This simple JavaScript application uploads an image to the API, and displays the information returned about it. Although this application is written in JavaScript, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/quickstarts/python.md
# Quickstart: Get image insights using the Bing Visual Search REST API and Python
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Visual Search API. This Python application uploads an image to the API and displays the information it returns. Although this application is written in Python, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Ruby https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/quickstarts/ruby.md
# Quickstart: Get image insights using the Bing Visual Search REST API and Ruby
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Visual Search API using the Ruby programming language. A POST request uploads an image to the API endpoint. The results include URLs and descriptive information about images similar to the uploaded image.
cognitive-services Tutorial Bing Visual Search Single Page App https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/tutorial-bing-visual-search-single-page-app.md
# Tutorial: Create a Visual Search single-page web app
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Visual Search API returns insights for an image. You can either upload an image or provide a URL to one. Insights are visually similar images, shopping sources, webpages that include the image, and more. Insights returned by the Bing Visual Search API are similar to ones shown on Bing.com/images.
cognitive-services Tutorial Visual Search Crop Area Results https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/tutorial-visual-search-crop-area-results.md
# Tutorial: Crop an image with the Bing Visual Search SDK for C#
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Visual Search SDK enables you to crop an image before finding similar online images. This application crops a single person from an image containing several people, and then returns search results containing similar images found online.
cognitive-services Tutorial Visual Search Image Upload https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/tutorial-visual-search-image-upload.md
# Tutorial: Upload images to the Bing Visual Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Visual Search API enables you to search the web for images similar to ones you upload. Use this tutorial to create a web application that can send an image to the API, and display the insights it returns within the webpage. Note that this application does not adhere to all [Bing Use and Display Requirements](../bing-web-search/use-display-requirements.md) for using the API.
cognitive-services Tutorial Visual Search Insights Token https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/tutorial-visual-search-insights-token.md
# Tutorial: Find similar images from previous searches using an image insights token
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Visual Search client library enables you to find images online from previous searches that return an `ImageInsightsToken`. This application gets an `ImageInsightsToken` and uses the token in a subsequent search. It then sends the `ImageInsightsToken` to Bing and returns results that include Bing Search URLs and URLs of similar images found online.
cognitive-services Use Insights Token https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/use-insights-token.md
# Use an insights token to get insights for an image
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Bing Visual Search API returns information about an image that you provide. You can provide the image by using its URL, by using an insights token, or by uploading the image. For information about these options, see [What is Bing Visual Search API?](overview.md). This article demonstrates using an insights token. For examples that demonstrate how to upload an image to get insights, see the quickstarts:
To run this application, follow these steps:

```python
# Download and install Python at https://www.python.org/
# Then run the following in a command console window:
# pip3 install requests

import requests
import json

HEADERS = {'Ocp-Apim-Subscription-Key': SUBSCRIPTION_KEY}

# To get an insights token, call the /images/search endpoint. Get the token from
# the imageInsightsToken field in the Image object.
insightsToken = 'ccid_tmaGQ2eU*mid_D12339146CFEDF3D409CC7A66D2C98D0D71904D4*simid_608022145667564759*thid_OIP.tmaGQ2eUI1yq3yll!_jn9kwHaFZ'
formData = '{"imageInfo":{"imageInsightsToken":"' + insightsToken + '"}}'

def print_json(obj):
    # Pretty-print a JSON-serializable object.
    print(json.dumps(obj, sort_keys=True, indent=2))

# Main execution
if __name__ == '__main__':
    main()
```
[Create a Visual Search single-page web app](tutorial-bing-visual-search-single-page-app.md)
[What is the Bing Visual Search API?](overview.md)
[Try Cognitive Services](https://aka.ms/bingvisualsearchtryforfree)
-[Images - Visual Search](/rest/api/cognitiveservices/bingvisualsearch/images/visualsearch)
+[Images - Visual Search](/rest/api/cognitiveservices/bingvisualsearch/images/visualsearch)
cognitive-services Train Model https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/language-service/conversational-language-understanding/how-to/train-model.md
The training times can be anywhere from a few seconds when dealing with orchestr
## Train model
-Enter a new model name or select an existing model from the **Model Name** dropdown. Press the enter key after you add a model name. Select whether you want to evaluate your model by changing the **Run evaluation with training** toggle. If enabled, your tagged utterances will be spilt into 3 parts; 80% for training, 10% for validation and 10% for testing. Afterwards, you'll be able to see the model's evaluation results.
+Select **Train model** on the left of the screen. Select **Start a training job** from the top menu.
+
+Enter a new model name or select an existing model from the **Model Name** dropdown.
+
+Select whether you want to evaluate your model by changing the **Run evaluation with training** toggle. If enabled, your tagged utterances will be split into two parts: 80% for training and 20% for testing. Afterwards, you'll be able to see the model's evaluation results.
:::image type="content" source="../media/train-model.png" alt-text="A screenshot showing the Train model page for Conversational Language Understanding projects." lightbox="../media/train-model.png":::
-Click the **Train** button and wait for training to complete. You will see the training status of your model in the view model details page.
+Click the **Train** button and wait for training to complete. You will see the training status of your model in the view model details page. Only successfully completed tasks will generate models.
## Evaluate model
cognitive-services Service Limits https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/language-service/custom-classification/service-limits.md
Custom text classification is only available in select Azure regions. When you crea
* You can't rename your project after creation.
+* Your project name must only contain alphanumeric characters (letters and numbers). Spaces and special characters are not allowed. Project names can have a maximum of 50 characters.
+
* You must have a minimum of 10 files in your project and a maximum of 1,000,000 files.
* You can have up to 10 trained models per project.
* Model names have to be unique within the same project.
+* Model names must only contain alphanumeric characters (letters and numbers). Spaces and special characters are not allowed. Model names can have a maximum of 50 characters.
+
* You can't rename your model after creation.
* You can only train one model at a time per project.
Custom text classification is only available in select Azure regions. When you crea
| Attribute | Limits |
|--|--|
-| Project name | You can only use letters `(a-z, A-Z)`, and numbers `(0-9)` with no spaces. |
-| Model name | You can only use letters `(a-z, A-Z)`, numbers `(0-9)` and symbols `@ # _ . , ^ \ [ ]` |
-| entity names| You can only use letters `(a-z, A-Z)`, numbers `(0-9)` and symbols `@ # _ . , ^ \ [ ]` |
-| File names | You can only use letters `(a-z, A-Z)`, and numbers `(0-9)` with no spaces. |
+| Project name | You can only use letters `(a-z, A-Z)`, and numbers `(0-9)` with no spaces. Maximum allowed length is 50 characters. |
+| Model name | You can only use letters `(a-z, A-Z)`, numbers `(0-9)` and symbols `@ # _ . , ^ \ [ ]`. Maximum allowed length is 50 characters. |
+| Class name| You can only use letters `(a-z, A-Z)`, numbers `(0-9)` and symbols `@ # _ . , ^ \ [ ]`. Maximum allowed length is 50 characters. |
+| File name | You can only use letters `(a-z, A-Z)`, and numbers `(0-9)` with no spaces. |
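
As a rough illustration of these naming constraints (a sketch only; the service performs the authoritative validation), names could be pre-checked with patterns like the following:

```js
// Patterns mirroring the naming limits above.
const projectNamePattern = /^[a-zA-Z0-9]{1,50}$/;                  // letters and numbers, max 50
const modelOrClassNamePattern = /^[a-zA-Z0-9@#_.,^\\\[\]]{1,50}$/; // also allows @ # _ . , ^ \ [ ]

console.log(projectNamePattern.test('SupportTickets42')); // true
console.log(projectNamePattern.test('my project'));       // false (contains a space)
```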
cognitive-services Service Limits https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/language-service/custom-named-entity-recognition/service-limits.md
Use this article to learn about the data and service limits when using Custom NE
* All files should be available at the root of your container.
-* Maximum allowed length for your file sis 128,000 characters, which is approximately 28,000 words or 56 pages.
+* Maximum allowed length for your file is 128,000 characters, which is approximately 28,000 words or 56 pages.
* Your [training dataset](how-to/train-model.md#data-split) should include at least 10 files and not more than 100,000 files.
-
## API limits

* When using the Authoring API, there is a maximum of 10 POST requests and 100 GET requests per minute.
Custom text classification is only available in select Azure regions. When you crea
## Project limits
-* You can only connect 1 storage container for each project. This process is irreversible. If you connect a container to your project, you cannot disconnect it later.
+* You can only connect 1 storage account for each project. This process is irreversible. If you connect a storage account to your project, you cannot disconnect it later.
* You can only have 1 [tags file](how-to/tag-data.md) per project. You cannot change to a different tags file later. You can only update the tags within your project.
* You cannot rename your project after creation.
+* Your project name must only contain alphanumeric characters (letters and numbers). Spaces and special characters are not allowed. Project names can have a maximum of 50 characters.
+
* You must have a minimum of 10 tagged files in your project and a maximum of 100,000 files.
-* You can have up to 10 trained models per project.
+* You can have up to 50 trained models per project.
* Model names have to be unique within the same project.
+* Model names must only contain alphanumeric characters (letters and numbers). Spaces and special characters are not allowed. Model names can have a maximum of 50 characters.
+
* You cannot rename your model after creation.
* You can only train one model at a time per project.
Custom text classification is only available in select Azure regions. When you crea
* It is recommended to have around 200 tagged instances per entity, and you must have a minimum of 10 tagged instances per entity.
+* Entity names must have a maximum of 50 characters.
+
## Naming limits

| Attribute | Limits |
|--|--|
-| Project name | You can only use letters `(a-z, A-Z)`, and numbers `(0-9)` with no spaces. |
-| Model name | You can only use letters `(a-z, A-Z)`, and numbers `(0-9)` with no spaces. |
-| Entity names| You can only use letters `(a-z, A-Z)`, and numbers `(0-9)` and symbols `@ # _ . , ^ \ [ ]` |
-| File names | You can only use letters `(a-z, A-Z)`, and numbers `(0-9)` with no spaces. |
+| Project name | You can only use letters `(a-z, A-Z)`, and numbers `(0-9)` with no spaces. Maximum length allowed is 50 characters. |
+| Model name | You can only use letters `(a-z, A-Z)`, and numbers `(0-9)` with no spaces. Maximum length allowed is 50 characters. |
+| Entity name | You can only use letters `(a-z, A-Z)`, numbers `(0-9)`, and symbols `@ # _ . , ^ \ [ ]`. Maximum length allowed is 50 characters. |
+| File name | You can only use letters `(a-z, A-Z)`, and numbers `(0-9)` with no spaces. |
## Next steps
cognitive-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/language-service/overview.md
Previously updated : 11/02/2021
Last updated : 02/01/2022
Azure Cognitive Service for Language provides the following features:
> [!div class="mx-tdCol2BreakAll"] > |Feature |Description | Deployment options| > ||||
-> | [Named Entity Recognition (NER)](named-entity-recognition/overview.md) | This pre-configured feature identifies entities in text across several pre-defined categories. | ΓÇó [Language Studio](language-studio.md) <br> ΓÇó [REST API and client-library](named-entity-recognition/quickstart.md) |
-> | [Personally Identifiable Information (PII) detection](personally-identifiable-information/overview.md) | This pre-configured feature identifies entities in text across several pre-defined categories of sensitive information, such as account information. | ΓÇó [Language Studio](language-studio.md) <br> ΓÇó [REST API and client-library](named-entity-recognition/quickstart.md) |
-> | [Key phrase extraction](key-phrase-extraction/overview.md) | This pre-configured feature evaluates unstructured text, and for each input document, returns a list of key phrases and main points in the text. | ΓÇó [Language Studio](language-studio.md) <br> ΓÇó [REST API and client-library](key-phrase-extraction/quickstart.md) <br> ΓÇó [Docker container](key-phrase-extraction/how-to/use-containers.md) |
-> |[Entity linking](entity-linking/overview.md) | This pre-configured feature disambiguates the identity of an entity found in text and provides links to the entity on Wikipedia. | ΓÇó [Language Studio](language-studio.md) <br> ΓÇó [REST API and client-library](entity-linking/quickstart.md) |
-> | [Text Analytics for health](text-analytics-for-health/overview.md) | This pre-configured feature extracts information from unstructured medical texts, such as clinical notes and doctor's notes. | ΓÇó [Language Studio](language-studio.md) <br> ΓÇó [REST API and client-library](text-analytics-for-health/quickstart.md) <br> ΓÇó [Docker container](text-analytics-for-health/how-to/use-containers.md) |
-> | [Custom NER](custom-named-entity-recognition/overview.md) | Build an AI model to extract custom entity categories, using unstructured text that you provide. | ΓÇó [Language Studio](custom-named-entity-recognition/quickstart.md?pivots=language-studio) <br> ΓÇó [REST API](custom-named-entity-recognition/quickstart.md?pivots=rest-api) |
-> | [Analyze sentiment and opinions](sentiment-opinion-mining/overview.md) | This pre-configured feature provides sentiment labels (such as "*negative*", "*neutral*" and "*positive*") for sentences and documents. This feature can additionally provide granular information about the opinions related to words that appear in the text, such as the attributes of products or services. | ΓÇó [Language Studio](language-studio.md) <br> ΓÇó [REST API and client-library](sentiment-opinion-mining/quickstart.md) <br> ΓÇó [Docker container](sentiment-opinion-mining/how-to/use-containers.md)
-> |[Language detection](language-detection/overview.md) | This pre-configured feature evaluates text, and determines the language it was written in. It returns a language identifier and a score that indicates the strength of the analysis. | ΓÇó [Language Studio](language-studio.md) <br> ΓÇó [REST API and client-library](language-detection/quickstart.md) <br> ΓÇó [Docker container](language-detection/how-to/use-containers.md) |
-> |[Custom text classification (preview)](custom-classification/overview.md) | Build an AI model to classify unstructured text into custom classes that you define. | ΓÇó [Language Studio](custom-classification/quickstart.md?pivots=language-studio)<br> ΓÇó [REST API](language-detection/quickstart.md?pivots=rest-api) |
-> | [Text Summarization (preview)](text-summarization/overview.md) | This pre-configured feature extracts key sentences that collectively convey the essence of a document. | ΓÇó [Language Studio](language-studio.md) <br> ΓÇó [REST API and client-library](text-summarization/quickstart.md) |
-> | [Conversational language understanding (preview)](conversational-language-understanding/overview.md) | Build an AI model to bring the ability to understand natural language into apps, bots, and IoT devices. | ΓÇó [Language Studio](conversational-language-understanding/quickstart.md)
-> | [Question answering](question-answering/overview.md) | This pre-configured feature provides answers to questions extracted from text input, using semi-structured content such as: FAQs, manuals, and documents. | ΓÇó [Language Studio](language-studio.md) <br> ΓÇó [REST API and client-library](question-answering/quickstart/sdk.md) |
+> | [Named Entity Recognition (NER)](named-entity-recognition/overview.md) | This pre-configured feature identifies entities in text across several pre-defined categories. | * [Language Studio](language-studio.md) <br> * [REST API and client-library](named-entity-recognition/quickstart.md) |
+> | [Personally Identifiable Information (PII) detection](personally-identifiable-information/overview.md) | This pre-configured feature identifies entities in text across several pre-defined categories of sensitive information, such as account information. | * [Language Studio](language-studio.md) <br> * [REST API and client-library](personally-identifiable-information/quickstart.md) |
+> | [Key phrase extraction](key-phrase-extraction/overview.md) | This pre-configured feature evaluates unstructured text, and for each input document, returns a list of key phrases and main points in the text. | * [Language Studio](language-studio.md) <br> * [REST API and client-library](key-phrase-extraction/quickstart.md) <br> * [Docker container](key-phrase-extraction/how-to/use-containers.md) |
+> |[Entity linking](entity-linking/overview.md) | This pre-configured feature disambiguates the identity of an entity found in text and provides links to the entity on Wikipedia. | * [Language Studio](language-studio.md) <br> * [REST API and client-library](entity-linking/quickstart.md) |
+> | [Text Analytics for health](text-analytics-for-health/overview.md) | This pre-configured feature extracts information from unstructured medical texts, such as clinical notes and doctor's notes. | * [Language Studio](language-studio.md) <br> * [REST API and client-library](text-analytics-for-health/quickstart.md) <br> * [Docker container](text-analytics-for-health/how-to/use-containers.md) |
+> | [Custom NER](custom-named-entity-recognition/overview.md) | Build an AI model to extract custom entity categories, using unstructured text that you provide. | * [Language Studio](custom-named-entity-recognition/quickstart.md?pivots=language-studio) <br> * [REST API](custom-named-entity-recognition/quickstart.md?pivots=rest-api) |
+> | [Analyze sentiment and opinions](sentiment-opinion-mining/overview.md) | This pre-configured feature provides sentiment labels (such as "*negative*", "*neutral*" and "*positive*") for sentences and documents. This feature can additionally provide granular information about the opinions related to words that appear in the text, such as the attributes of products or services. | * [Language Studio](language-studio.md) <br> * [REST API and client-library](sentiment-opinion-mining/quickstart.md) <br> * [Docker container](sentiment-opinion-mining/how-to/use-containers.md)
+> |[Language detection](language-detection/overview.md) | This pre-configured feature evaluates text, and determines the language it was written in. It returns a language identifier and a score that indicates the strength of the analysis. | * [Language Studio](language-studio.md) <br> * [REST API and client-library](language-detection/quickstart.md) <br> * [Docker container](language-detection/how-to/use-containers.md) |
+> |[Custom text classification (preview)](custom-classification/overview.md) | Build an AI model to classify unstructured text into custom classes that you define. | * [Language Studio](custom-classification/quickstart.md?pivots=language-studio)<br> * [REST API](custom-classification/quickstart.md?pivots=rest-api) |
+> | [Text Summarization (preview)](text-summarization/overview.md) | This pre-configured feature extracts key sentences that collectively convey the essence of a document. | * [Language Studio](language-studio.md) <br> * [REST API and client-library](text-summarization/quickstart.md) |
+> | [Conversational language understanding (preview)](conversational-language-understanding/overview.md) | Build an AI model to bring the ability to understand natural language into apps, bots, and IoT devices. | * [Language Studio](conversational-language-understanding/quickstart.md) |
+> | [Question answering](question-answering/overview.md) | This pre-configured feature provides answers to questions extracted from text input, using semi-structured content such as: FAQs, manuals, and documents. | * [Language Studio](language-studio.md) <br> * [REST API and client-library](question-answering/quickstart/sdk.md) |
## Tutorials
After you've had a chance to get started with the Language service, try our tuto
* [Extract key phrases from text stored in Power BI](key-phrase-extraction/tutorials/integrate-power-bi.md)
* [Use Power Automate to sort information in Microsoft Excel](named-entity-recognition/tutorials/extract-excel-information.md)
-* [Use Flask to translate text, analyze sentiment, and synthesize speech](../translator/tutorial-build-flask-app-translation-synthesis.md?context=%2fazure%2fcognitive-services%2flanguage-service%2fcontext%2fcontext)
+* [Use Flask to translate text, analyze sentiment, and synthesize speech](/learn/modules/python-flask-build-ai-web-app/)
* [Use Cognitive Services in canvas apps](/powerapps/maker/canvas-apps/cognitive-services-api?context=/azure/cognitive-services/language-service/context/context)
* [Create a FAQ Bot](question-answering/tutorials/bot-service.md)
An AI system includes not only the technology, but also the people who will use
* [Transparency note for the Language service](/legal/cognitive-services/text-analytics/transparency-note)
* [Integration and responsible use](/legal/cognitive-services/text-analytics/guidance-integration-responsible-use)
-* [Data, privacy, and security](/legal/cognitive-services/text-analytics/data-privacy)
+* [Data, privacy, and security](/legal/cognitive-services/text-analytics/data-privacy)
cognitive-services Best Practices https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/language-service/question-answering/concepts/best-practices.md
Question answering takes casing into account but it's intelligent enough to unde
### How are question answer pairs prioritized for multi-turn questions?
-When a knowledge base has hierarchical relationships (either added manually or via extraction) and the previous response was an answer related to other question answer pairs, for the next query we give slight preference to all the children question answer pairs, sibling question answer pairs, and grandchildren question answer pairs in that order. Along with any query, the [Question Answering REST API](https://docs.microsoft.com/rest/api/cognitiveservices/questionanswering/question-answering/get-answers) expects a `context` object with the property `previousQnAId`, which denotes the last top answer. Based on this previous `QnAID`, all the related `QnAs` are boosted.
+When a knowledge base has hierarchical relationships (either added manually or via extraction) and the previous response was an answer related to other question answer pairs, for the next query we give slight preference to all the children question answer pairs, sibling question answer pairs, and grandchildren question answer pairs in that order. Along with any query, the [Question Answering REST API](/rest/api/cognitiveservices/questionanswering/question-answering/get-answers) expects a `context` object with the property `previousQnAId`, which denotes the last top answer. Based on this previous `QnAID`, all the related `QnAs` are boosted.
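
As a rough illustration (the property names follow the REST reference linked above; the question text and ID value here are hypothetical), a follow-up query might carry the context like this:

```js
// Hypothetical follow-up request body for the get-answers operation.
const requestBody = {
    question: 'How do I reset it?',
    top: 3,
    context: {
        previousQnAId: 42 // ID of the last top answer, so related QnA pairs are boosted
    }
};
```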
### How are accents treated?
cognitive-services What Are Cognitive Services https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/what-are-cognitive-services.md
Title: What are Azure Cognitive Services?
-description: Cognitive Services makes AI accessible to every developer without requiring machine-learning and data-science expertise. You just need to make an API call from your application to add the ability to see (advanced image search and recognition), hear, speak, search, and decision-making into your apps.
+description: Cognitive Services makes AI accessible to every developer without requiring machine-learning and data-science expertise. You make an API call from your application to add the ability to see (advanced image search and recognition), hear, speak, search, and make decisions in your apps.
keywords: cognitive services, cognitive intelligence, cognitive solutions, ai services, cognitive understanding, cognitive features
Previously updated : 01/05/2022
Last updated : 01/31/2022

# What are Azure Cognitive Services?
-Azure Cognitive Services are cloud-based services with REST APIs and client library SDKs available to help you build cognitive intelligence into your applications. You can add cognitive features to your applications without having artificial intelligence (AI) or data science skills. Azure Cognitive Services comprise various AI services that enable you to build cognitive solutions that can see, hear, speak, understand, and even make decisions.
+Azure Cognitive Services are cloud-based services with REST APIs, client library SDKs, and user interfaces available to help you build cognitive intelligence into your applications. You can add cognitive features to your applications without having artificial intelligence (AI) or data science skills. Cognitive Services comprises various AI services that enable you to build cognitive solutions that can see, hear, speak, understand, and even make decisions.
## Categories of Cognitive Services
-The catalog of cognitive services that provide cognitive understanding is categorized into four main pillars:
+Cognitive Services can be categorized into four main pillars:
* Vision
* Speech
* Language
* Decision
-The following sections in this article provide a list of services that are part of these four pillars.
-
## Vision APIs

|Service Name|Service Description|
|:--|:|
|[Computer Vision](./computer-vision/index.yml "Computer Vision")|The Computer Vision service provides you with access to advanced cognitive algorithms for processing images and returning information. See [Computer Vision quickstart](./computer-vision/quickstarts-sdk/client-library.md) to get started with the service.|
-|[Custom Vision Service](./custom-vision-service/index.yml "Custom Vision Service")|The Custom Vision Service lets you build, deploy, and improve your own image classifiers. An image classifier is an AI service that applies labels to images, based on their visual characteristics. |
+|[Custom Vision](./custom-vision-service/index.yml "Custom Vision Service")|The Custom Vision Service lets you build, deploy, and improve your own image classifiers. An image classifier is an AI service that applies labels to images, based on their visual characteristics. |
|[Face](./face/index.yml "Face")| The Face service provides access to advanced face algorithms, enabling face attribute detection and recognition. See [Face quickstart](./face/quickstarts/client-libraries.md) to get started with the service.|

## Speech APIs
The following sections in this article provide a list of services that are part
|[Bing Speech](./speech-service/how-to-migrate-from-bing-speech.md "Bing Speech") (Retiring)|The Bing Speech API provides you with an easy way to create speech-enabled features in your applications.|
|[Translator Speech](/azure/cognitive-services/translator-speech/ "Translator Speech") (Retiring)|Translator Speech is a machine translation service.|
-->
+
## Language APIs

|Service Name|Service Description|
|:--|:|
-|[Azure Cognitive Service for language](./language-service/index.yml "Language service")| Azure Cognitive Service for Language provides several Natural Language Processing (NLP) features for understanding and analyzing text.|
-|[Language Understanding LUIS](./luis/index.yml "Language Understanding")|Language Understanding (LUIS) is a cloud-based conversational AI service that applies custom machine-learning intelligence to a user's conversational, natural language text to predict overall meaning, and pull out relevant, detailed information. [See LUIS quickstart](./luis/luis-get-started-create-app.md) to get started with the service.|
+|[Azure Cognitive Service for language](./language-service/index.yml "Language service")| Azure Cognitive Service for Language provides several Natural Language Processing (NLP) features to understand and analyze text.|
+|[Translator](./translator/index.yml "Translator")|Translator provides machine-based text translation in near real time.|
+|[Language Understanding LUIS](./luis/index.yml "Language Understanding")|Language Understanding (LUIS) is a cloud-based conversational AI service that applies custom machine-learning intelligence to a user's conversational or natural language text to predict overall meaning and pull out relevant information. [See LUIS quickstart](./luis/luis-get-started-create-app.md) to get started with the service.|
|[QnA Maker](./qnamaker/index.yml "QnA Maker")|QnA Maker allows you to build a question and answer service from your semi-structured content. [See QnA Maker quickstart](./qnamaker/quickstarts/create-publish-knowledge-base.md) to get started with the service.|
-|[Translator](./translator/index.yml "Translator")|Translator provides machine-based text translation in near real-time.|
## Decision APIs
Start by creating a Cognitive Services resource with hands-on quickstarts using
* [Azure portal](cognitive-services-apis-create-account.md?tabs=multiservice%2Cwindows "Azure portal")
* [Azure CLI](cognitive-services-apis-create-account-cli.md?tabs=windows "Azure CLI")
* [Azure SDK client libraries](cognitive-services-apis-create-account-cli.md?tabs=windows "cognitive-services-apis-create-account-client-library?pivots=programming-language-csharp")
-* [Azure Resource Manager (ARM) templates](./create-account-resource-manager-template.md?tabs=portal "Azure Resource Manager (ARM) templates")
+* [Azure Resource Manager (ARM template)](./create-account-resource-manager-template.md?tabs=portal "Azure Resource Manager (ARM template)")
## Using Cognitive Services in different development environments
All APIs have a free tier, which has usage and throughput limits. You can incre
## Using Cognitive Services securely
-Azure Cognitive Services provides a layered security model, including [authentication](authentication.md "authentication") via Azure Active Directory credentials, a valid resource key, and [Azure Virtual Networks](cognitive-services-virtual-networks.md "Azure Virtual Networks").
+Azure Cognitive Services provides a layered security model, including [authentication](authentication.md "Authentication") via Azure Active Directory credentials, a valid resource key, and [Azure Virtual Networks](cognitive-services-virtual-networks.md "Azure Virtual Networks").
## Containers for Cognitive Services
- Azure Cognitive Services provides several Docker containers that let you use the same APIs that are available in Azure, on-premises. Using these containers gives you the flexibility to bring Cognitive Services closer to your data for compliance, security or other operational reasons. Learn more about [Cognitive Services Containers](cognitive-services-container-support.md "Cognitive Services Containers").
+ Azure Cognitive Services provides several Docker containers that let you use the same APIs that are available in Azure, on-premises. Using these containers gives you the flexibility to bring Cognitive Services closer to your data for compliance, security, or other operational reasons. For more information, see [Cognitive Services Containers](cognitive-services-container-support.md "Cognitive Services Containers").
## Regional availability
Looking for a region we don't support yet? Let us know by filing a feature reque
## Supported cultural languages
-Cognitive Services supports a wide range of cultural languages at the service level. You can find the language availability for each API in the [supported languages list](language-support.md "supported languages list").
+Cognitive Services supports a wide range of cultural languages at the service level. You can find the language availability for each API in the [supported languages list](language-support.md "Supported languages list").
## Certifications and compliance
-Cognitive Services has been awarded certifications such as CSA STAR Certification, FedRAMP Moderate, and HIPAA BAA. You can [download](https://gallery.technet.microsoft.com/Overview-of-Azure-c1be3942 "download") certifications for your own audits and security reviews.
+Cognitive Services has been awarded certifications such as CSA STAR Certification, FedRAMP Moderate, and HIPAA BAA. You can [download](https://gallery.technet.microsoft.com/Overview-of-Azure-c1be3942 "Download") certifications for your own audits and security reviews.
To understand privacy and data management, go to the [Trust Center](https://servicetrust.microsoft.com/ "Trust Center").
communication-services Custom Teams Endpoint Authentication Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/interop/custom-teams-endpoint-authentication-overview.md
+
+ Title: Authentication of custom Teams endpoint
+
+description: This article discusses authentication of a custom Teams endpoint.
+Last updated : 06/30/2021
+# Authentication flow cases
+
+Azure Communication Services provides developers the ability to build a custom Teams calling experience with the Communication Services calling software development kit (SDK). This article provides insights into the process of authentication and describes the individual authentication artifacts. The following use cases demonstrate authentication for single-tenant and multi-tenant Azure Active Directory (Azure AD) applications.
+
+## Case 1: Single-tenant application
+The following scenario shows an example of the company Fabrikam, which has built a custom Teams calling application for internal use within the company. All Teams users are managed by Azure Active Directory. Access to Azure Communication Services is controlled via Azure role-based access control (Azure RBAC).
++
![Diagram of the process for authenticating a Teams user to access the Fabrikam client application and Fabrikam's Azure Communication Services resource.](./media/custom-teams-endpoint/authentication-case-single-tenant-azure-rbac-overview.svg)
+
+The following sequence diagram shows the detailed steps of the authentication:
++
+Prerequisites:
+- Alice or her Azure AD Administrator needs to provide consent to Fabrikam's Azure Active Directory application before the first sign-in. To learn more, see [consent flow](https://docs.microsoft.com/azure/active-directory/develop/consent-framework).
+- The admin of the Azure Communication Services resource must grant Alice permission to perform this action. To learn more, see [Azure RBAC role assignment](https://docs.microsoft.com/azure/role-based-access-control/role-assignments-portal).
+
+Steps:
+1. Authentication of Alice from Fabrikam against Fabrikam's Azure Active Directory: This step is a standard OAuth flow that uses the Microsoft Authentication Library (MSAL) to authenticate against Fabrikam's Azure Active Directory. Alice authenticates for Fabrikam's Azure AD application. If the authentication of Alice is successful, Fabrikam's client application receives the Azure AD access token 'A'. Details of the token are captured below. Developer experience is captured in the [quickstart](../../quickstarts/manage-teams-identity.md).
+1. Get an access token for Alice: This flow is initiated from Fabrikam's client application and performs control plane logic, authorized by artifact 'A', to retrieve Fabrikam's Azure Communication Services access token 'D' for Alice. Details of the token are captured below, and a minimal sketch of steps 1 and 2 follows this list. This access token can be used for data plane actions in Azure Communication Services, such as calling. Developer experience is captured in the [quickstart](../../quickstarts/manage-teams-identity.md).
+1. Start a call to Bob from Fabrikam: Alice uses the Azure Communication Services access token to make a call to the Teams user Bob via the Communication Services calling SDK. You can learn more about the [developer experience in the quickstart](../../quickstarts/voice-video-calling/get-started-with-voice-video-calling-custom-teams-client.md).
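+
+A minimal sketch of steps 1 and 2, assuming the `@azure/msal-browser` package for the sign-in (the scope below matches artifact 'A'; the placeholder client ID and the backend exchange are assumptions, not the documented implementation):
+
+```js
+import { PublicClientApplication } from '@azure/msal-browser';
+
+const msalClient = new PublicClientApplication({
+    auth: { clientId: '<FABRIKAM_AZURE_AD_APPLICATION_ID>' } // hypothetical placeholder
+});
+
+// Step 1: Alice signs in, and the client application receives the Azure AD access token 'A'.
+const result = await msalClient.loginPopup({
+    scopes: ['https://auth.msft.communication.azure.com/Teams.ManageCalls']
+});
+const aadToken = result.accessToken; // artifact 'A'
+
+// Step 2: the client sends 'A' to Fabrikam's backend, which exchanges it for the
+// Communication Services access token 'D' (see the server-side sketch in Case 2).
+```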
+
+Artifacts:
+- Artifact A
+ - Type: Azure AD access token
+ - Audience: _`Azure Communication Services`_ (control plane)
+ - Azure AD application ID: Fabrikam's _`Azure AD application ID`_
+ - Permission: _`https://auth.msft.communication.azure.com/Teams.ManageCalls`_
+- Artifact D
+ - Type: Azure Communication Services access token
+ - Audience: _`Azure Communication Services`_ (data plane)
+ - Azure Communication Services Resource ID: Fabrikam's _`Azure Communication Services Resource ID`_
+
+## Case 2: Multi-tenant application
+The following scenario shows an example of the company Contoso, which has built a custom Teams calling application for external customers, such as the company Fabrikam. Contoso uses custom authentication within its own infrastructure, and uses a connection string to retrieve the token for Fabrikam's Teams user.
+
![Diagram of the process for authenticating a Fabrikam Teams user to access the Contoso client application and Contoso's Azure Communication Services resource.](./media/custom-teams-endpoint/authentication-case-multiple-tenants-hmac-overview.svg)
+
+The following sequence diagram shows the detailed steps of the authentication:
++
+Prerequisites:
+- Alice or her Azure AD Administrator needs to provide consent to Contoso's Azure Active Directory application before the first sign-in. To learn more, see [consent flow](https://docs.microsoft.com/azure/active-directory/develop/consent-framework).
+
+Steps:
+1. Authentication of Alice from Fabrikam against Fabrikam's Azure Active Directory: This step is a standard OAuth flow that uses the Microsoft Authentication Library (MSAL) to authenticate against Fabrikam's Azure Active Directory. Alice authenticates for Contoso's Azure AD application. If the authentication of Alice is successful, Contoso's client application receives the Azure AD access token 'A'. Details of the token are captured below. Developer experience is captured in the [quickstart](../../quickstarts/manage-teams-identity.md).
+1. Get an access token for Alice: This flow is initiated from Contoso's client application and performs control plane logic, authorized by artifact 'A', to retrieve Contoso's Azure Communication Services access token 'D' for Alice. Details of the token are captured below, and a minimal server-side sketch of this exchange follows this list. This access token can be used for data plane actions in Azure Communication Services, such as calling. Developer experience is captured in the [quickstart](../../quickstarts/manage-teams-identity.md).
+1. Start a call to Bob from Fabrikam: Alice uses the Azure Communication Services access token to make a call to the Teams user Bob via the Communication Services calling SDK. You can learn more about the [developer experience in the quickstart](../../quickstarts/voice-video-calling/get-started-with-voice-video-calling-custom-teams-client.md).
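+
+A rough server-side sketch of step 2, assuming the Node.js `@azure/communication-identity` package (the exact `getTokenForTeamsUser` signature varies by SDK version, so treat this as an assumption rather than the documented implementation):
+
+```js
+const { CommunicationIdentityClient } = require('@azure/communication-identity');
+
+// The connection string authorizes the exchange; it is also the basis of the HMAC (artifact 'C').
+const identityClient = new CommunicationIdentityClient(process.env.COMMUNICATION_SERVICES_CONNECTION_STRING);
+
+// 'aadToken' is the Azure AD access token 'A' received from Contoso's client application.
+async function issueTokenForTeamsUser(aadToken) {
+    // Exchange 'A' for the Communication Services access token 'D'.
+    const accessToken = await identityClient.getTokenForTeamsUser(aadToken);
+    return accessToken; // { token, expiresOn }
+}
+```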
++
+Artifacts:
+- Artifact A
+ - Type: Azure AD access token
+ - Audience: Azure Communication Services (control plane)
+ - Azure AD application ID: Contoso's _`Azure AD application ID`_
+ - Permission: _`https://auth.msft.communication.azure.com/Teams.ManageCalls`_
+- Artifact B
+ - Type: Custom Contoso authentication artifact
+- Artifact C
+ - Type: Hash-based Message Authentication Code (HMAC) (based on Contoso's _`connection string`_)
+- Artifact D
+ - Type: Azure Communication Services access token
+ - Audience: _`Azure Communication Services`_ (data plane)
+ - Azure Communication Services Resource ID: Contoso's _`Azure Communication Services Resource ID`_
+
+## Next steps
+
+The following articles might be of interest to you:
+
+- Learn more about [authentication](../authentication.md).
+- Try [quickstart for authentication of Teams users](../../quickstarts/manage-teams-identity.md).
+- Try [quickstart for calling to a Teams user](../../quickstarts/voice-video-calling/get-started-with-voice-video-calling-custom-teams-client.md).
communication-services Custom Teams Endpoint Firewall Configuration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/interop/custom-teams-endpoint-firewall-configuration.md
+
+ Title: Firewall configuration
+
+description: This article describes firewall configuration requirements to enable a custom Teams endpoint.
+Last updated : 06/30/2021
+# Firewall configuration
+
+Azure Communication Services provides the ability to use the Communication Services calling software development kit (SDK) to build a custom Teams calling experience. To enable this experience, administrators need to configure the firewall according to both Communication Services and Microsoft Teams guidance. The Communication Services requirements enable the control plane, and the Teams requirements enable the calling experience. If an independent software vendor (ISV) provides the authentication experience, use that vendor's configuration guidance instead of the Communication Services configuration.
+
+The following articles might be of interest to you:
+
+- Learn more about [Azure Communication Services firewall configuration](../voice-video-calling/network-requirements.md).
+- Learn about [Microsoft Teams firewall configuration](https://docs.microsoft.com/microsoft-365/enterprise/urls-and-ip-address-ranges?view=o365-worldwide#skype-for-business-online-and-microsoft-teams).
communication-services Custom Teams Endpoint Use Cases https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/interop/custom-teams-endpoint-use-cases.md
+
+ Title: Use cases for custom Teams endpoint
+
+description: This article describes use cases for a custom Teams endpoint.
+Last updated : 06/30/2021
+# Custom Teams endpoint: Use cases
+
+Microsoft Teams provides identities managed by Azure Active Directory and calling experiences controlled by the Teams admin center and policies. Users might have licenses assigned to enable PSTN connectivity and the advanced calling capabilities of Teams Phone System. Azure Communication Services supports Teams identities for managing Teams VoIP calls and Teams PSTN calls, and for joining Teams meetings. Developers might extend Azure Communication Services with the Graph API to provide contextual data from the Microsoft 365 ecosystem. This page provides inspiration for using existing Microsoft technologies to deliver an end-to-end experience for calling scenarios with Teams users and the Azure Communication Services calling SDKs.
+
+## Use case 1: Make outbound Teams PSTN call
+This scenario shows a multi-tenant use case in which the company Contoso provides SaaS to the company Fabrikam. The SaaS allows Fabrikam's users to make Teams PSTN calls via a custom website that takes the identity of the Teams user and the configuration of the PSTN connectivity assigned to that Teams user.
+
![Diagram showing the user experience of Alice making a Teams PSTN call to the customer Megan.](./media/custom-teams-endpoint/end-to-end-use-cases/cte-e2e-cte-to-pstn-out-overview.svg)
+
+The following sequence diagram shows the detailed steps for initiating a Teams PSTN call:
++
+### Steps
+1. Authenticate Alice from Fabrikam in Contoso's client application: Alice uses a browser to open Fabrikam's web page and authenticates. You can find more details about [authentication with a Teams identity](./custom-teams-endpoint-authentication-overview.md). If the authentication is successful, Alice is redirected to the initial page.
+2. Load customers and their PSTN numbers: Contoso provides custom logic to retrieve the list of customers and their associated phone numbers. This list is rendered on the initial page to Alice.
+3. Initiate a call to Megan: Alice selects a button to initiate a PSTN call to Megan in Contoso's client application. The client application uses the Azure Communication Services calling SDK to provide the calling capability. First, it creates an instance of `CallAgent`, which holds the Azure Communication Services access token acquired in the first step.
+
+```js
+const callClient = new CallClient();
+tokenCredential = new AzureCommunicationTokenCredential('<ALICE_ACCESS_TOKEN>');
+callAgent = await callClient.createCallAgent(tokenCredential)
+```
+Then you need to start a call to Megan's phone number.
+
+```js
+const pstnCallee = { phoneNumber: '<MEGAN_PHONE_NUMBER_E164_FORMAT>' }
+const oneToOneCall = callAgent.startCall([pstnCallee], { threadId: '00000000-0000-0000-0000-000000000000' });
+```
+4. Connecting the PSTN call to Megan: The call is routed through the Teams PSTN connectivity assigned to Alice, reaching the PSTN network and ringing the phone associated with the provided phone number. Megan sees an incoming call from the phone number associated with Alice's Teams user.
+5. Megan accepts the call: Megan accepts the call, and the connection between Alice and Megan is established.
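+
+If the application needs to react when the connection is established, it can subscribe to call state changes. A minimal sketch (`oneToOneCall` is the call object created in step 3):
+
+```js
+// Observe call state transitions, for example to update the UI.
+oneToOneCall.on('stateChanged', () => {
+    if (oneToOneCall.state === 'Connected') {
+        console.log('Call between Alice and Megan is connected');
+    }
+});
+```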
+
+## Use case 2: Receive inbound Teams PSTN call
+This scenario shows a multi-tenant use case in which the company Contoso provides SaaS to the company Fabrikam. The SaaS allows Fabrikam's users to receive Teams PSTN calls via a custom website that takes the identity of the Teams user and the configuration of the PSTN connectivity assigned to that Teams user.
+
![Diagram showing the user experience of Alice receiving a Teams PSTN call from the customer Megan.](./media/custom-teams-endpoint/end-to-end-use-cases/cte-e2e-cte-to-pstn-in-overview.svg)
+
+The following sequence diagram shows detailed steps for accepting incoming Teams PSTN calls:
++
+### Steps
+1. Authenticate Alice from Fabrikam in Contoso's client application: Alice uses a browser to open Fabrikam's web page and authenticates. You can find more details about [authentication with a Teams identity](./custom-teams-endpoint-authentication-overview.md). If the authentication is successful, Alice is redirected to the initial page.
+2. Subscribe to receive calls: The client application uses the Azure Communication Services calling SDK to provide the calling capability. First, it creates an instance of `CallAgent`, which holds the Azure Communication Services access token acquired in the first step.
+
+```js
+const callClient = new CallClient();
+tokenCredential = new AzureCommunicationTokenCredential('<ALICE_ACCESS_TOKEN>');
+callAgent = await callClient.createCallAgent(tokenCredential)
+```
+Then you subscribe to the incoming call event.
+
+```js
+const incomingCallHandler = async (args) => {
+    const incomingCall = args.incomingCall;
+    // Get information about the caller
+    const callerInfo = incomingCall.callerInfo;
+
+    showIncomingCall(callerInfo, incomingCall);
+};
+callAgent.on('incomingCall', incomingCallHandler);
+```
+The method _showIncomingCall_ is a custom Contoso method that renders a user interface indicating the incoming call, with two buttons to accept or decline it. If you select the accept button, the following code is used:
+
+```js
+// Accept the call
+var call = await incomingCall.accept();
+```
+
+If you select the decline button, then the following code is used:
++
+```js
+// Reject the call
+incomingCall.reject();
+```
+
+3. Megan starts a call to the PSTN number assigned to the Teams user Alice: Megan uses her phone to call Alice. The carrier network connects to the Teams PSTN connectivity assigned to Alice and rings all Teams endpoints registered for Alice: the Teams desktop, mobile, and web clients, and applications based on the Azure Communication Services calling SDK.
+4. Contoso's client application shows Megan's incoming call: The client application receives an incoming call notification. The _showIncomingCall_ method would use custom Contoso logic to translate the phone number to a customer's name, for example, via a database storing key-value pairs of phone numbers and customer names (see the sketch after these steps). When the information is retrieved, the notification is shown to Alice in Contoso's client application.
+5. Alice accepts the call: Alice selects a button to accept the call, and the connection between Alice and Megan is established.
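+
+A minimal sketch of such a _showIncomingCall_ implementation follows. The lookup table and the `renderIncomingCallUi` helper are hypothetical; Contoso would substitute its own customer store and UI code:
+
+```js
+// Hypothetical customer lookup table (phone number -> customer name).
+const customersByPhone = new Map([
+    ['+15551234567', 'Megan']
+]);
+
+function showIncomingCall(callerInfo, incomingCall) {
+    // For PSTN calls, the caller identifier carries the phone number.
+    const phoneNumber = callerInfo.identifier.phoneNumber;
+    const customerName = customersByPhone.get(phoneNumber) ?? phoneNumber;
+    renderIncomingCallUi(customerName, incomingCall); // Contoso's UI code (hypothetical)
+}
+```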
++
+## Use case 3: Make outbound Teams VoIP call
+This scenario shows a multi-tenant use case in which the company Contoso provides SaaS to the company Fabrikam. The SaaS allows Fabrikam's users to make Teams VoIP calls via a custom website that takes the identity of the Teams user.
+
![Diagram showing the user experience of Alice making a Teams VoIP call to her colleague Megan.](./media/custom-teams-endpoint/end-to-end-use-cases/cte-e2e-cte-to-voip-out-overview.svg)
+
+The following sequence diagram shows the detailed steps for initiating a Teams VoIP call:
++
+### Steps
+1. Authenticate Alice from Fabrikam in Contoso's client application: Alice uses a browser to open Fabrikam's web page and authenticates. You can find more details about [authentication with a Teams identity](./custom-teams-endpoint-authentication-overview.md). If the authentication is successful, Alice is redirected to the initial page.
+2. Load users from Fabrikam's organization and their identifiers: The Contoso client application uses the Graph API to get a list of users from Fabrikam's tenant. Alice or her admin needs to provide consent to the Graph API to perform this action. You can learn more about [the Graph API command in the documentation](https://docs.microsoft.com/graph/api/user-list).
+
+```
+GET https://graph.microsoft.com/v1.0/users
+Permissions: User.ReadBasic.All (delegated)
+Response: response.body.value[1].displayName; // "Megan Bowen"
+ response.body.value[1].id; // "e8b753b5-4117-464e-9a08-713e1ff266b3"
+```
+
+Contoso's client application will then show the list of users and the ability to initiate a call to a given user.
+
+3. Initiate a call to Megan: Alice selects a button to initiate a Teams VoIP call to Megan in Contoso's client application. The client application uses the Azure Communication Services calling SDK to provide the calling capability. Calls in Teams clients are associated with a Teams chat. First, the application requests the creation of a dedicated chat for the VoIP call.
+
+```
+POST https://graph.microsoft.com/v1.0/chats
+Body:
+{
+ "chatType": "oneOnOne",
+ "members": [
+ {
+ "@odata.type": "#microsoft.graph.aadUserConversationMember",
+ "roles": [
+ "owner"
+ ],
+ "user@odata.bind": "https://graph.microsoft.com/v1.0/users('8c0a1a67-50ce-4114-bb6c-da9c5dbcf6ca')"
+ },
+ {
+ "@odata.type": "#microsoft.graph.aadUserConversationMember",
+ "roles": [
+ "owner"
+ ],
+ "user@odata.bind": "https://graph.microsoft.com/v1.0/users('e8b753b5-4117-464e-9a08-713e1ff266b3')"
+ }
+ ]
+}
+Permissions: Chat.Create (delegated)
+Response: response.body.value.id; // "19:8c0a1a67-50ce-4114-bb6c-da9c5dbcf6ca_e8b753b5-4117-464e-9a08-713e1ff266b3@unq.gbl.spaces"
+```
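+
+The same request could be issued from the client as a sketch like the following, where `graphAccessToken` is the hypothetical token variable from the first step and `chatBody` holds the JSON body shown above:
+
+```js
+// Hedged sketch: creating the one-on-one chat and capturing its thread ID.
+// chatBody: the JSON request body shown above (hypothetical variable).
+const chatResponse = await fetch('https://graph.microsoft.com/v1.0/chats', {
+    method: 'POST',
+    headers: {
+        Authorization: `Bearer ${graphAccessToken}`,
+        'Content-Type': 'application/json'
+    },
+    body: JSON.stringify(chatBody)
+});
+const threadId = (await chatResponse.json()).id; // used below when starting the call
+```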
+
+Then the client application creates an instance of `CallAgent`, which holds the Azure Communication Services access token acquired during the first step.
+
+```js
+import { CallClient } from '@azure/communication-calling';
+import { AzureCommunicationTokenCredential } from '@azure/communication-common';
+
+const callClient = new CallClient();
+const tokenCredential = new AzureCommunicationTokenCredential('<ALICE_ACCESS_TOKEN>');
+const callAgent = await callClient.createCallAgent(tokenCredential);
+```
+Then the application starts a call to Megan's Teams user ID.
+
+```js
+const teamsUser = { microsoftTeamsUserId: 'e8b753b5-4117-464e-9a08-713e1ff266b3' };
+const oneToOneCall = callAgent.startCall([teamsUser], { threadId: '19:8c0a1a67-50ce-4114-bb6c-da9c5dbcf6ca_e8b753b5-4117-464e-9a08-713e1ff266b3@unq.gbl.spaces' });
+```
+
+4. Connecting the VoIP call to Megan: The call is routed through Teams and rings the Teams clients associated with Megan. Megan sees an incoming call from Alice with the name defined in Azure AD.
+5. Megan accepts the call: Megan accepts the call and the connection between Alice and Megan is established.
++
+## Use case 4: Receive inbound Teams VoIP call
+This scenario shows a multi-tenant use case, where company Contoso provides SaaS to company Fabrikam. The SaaS allows Fabrikam's users to receive Teams VoIP calls via a custom website that takes the identity of the Teams user and applies the routing policies assigned to that user.
+
+![Diagram showing the user experience of Alice receiving a Teams VoIP call from customer Megan.](./media/custom-teams-endpoint/end-to-end-use-cases/cte-e2e-cte-to-voip-in-overview.svg)
+
+The following sequence diagram shows detailed steps for accepting incoming Teams VoIP calls:
++
+### Steps
+1. Authenticate Alice from Fabrikam in Contoso's client application: Alice uses a browser to open Fabrikam's web page and authenticates. You can find more details about [authentication with a Teams identity](./custom-teams-endpoint-authentication-overview.md). If the authentication is successful, Alice is redirected to the initial page.
+2. Subscribe for receiving calls: The client application uses the Azure Communication Services Calling SDK to provide the calling capability. First, it creates an instance of `CallAgent`, which holds the Azure Communication Services access token acquired during the first step.
+
+```js
+const callClient = new CallClient();
+const tokenCredential = new AzureCommunicationTokenCredential('<ALICE_ACCESS_TOKEN>');
+const callAgent = await callClient.createCallAgent(tokenCredential);
+```
+Then the application subscribes to the incoming call event.
+
+```js
+const incomingCallHandler = async (args) => {
+    const incomingCall = args.incomingCall;
+    // Get information about the caller
+    const callerInfo = incomingCall.callerInfo;
+
+    showIncomingCall(callerInfo, incomingCall);
+};
+callAgent.on('incomingCall', incomingCallHandler);
+```
+The _showIncomingCall_ method is a custom Contoso method that renders a user interface indicating the incoming call, with two buttons to accept or decline it. If the accept button is selected, the following code is used:
+
+```js
+// Accept the call
+const call = await incomingCall.accept();
+```
+
+If you select the decline button, then the following code is used:
++
+```js
+// Reject the call
+incomingCall.reject();
+```
+
+3. Megan starts a VoIP call to Teams user Alice: Megan uses her Teams desktop client to call Alice. The Teams infrastructure rings all endpoints associated with Alice: Teams desktop, mobile, and web clients, and applications based on the Azure Communication Services Calling SDK.
+4. Contoso's client application shows Megan's incoming call: The client application receives an incoming call notification. The _showIncomingCall_ method would use the Graph API to translate the Teams user ID to a display name.
+
+```
+GET https://graph.microsoft.com/v1.0/users/e8b753b5-4117-464e-9a08-713e1ff266b3
+Permissions: User.Read (delegated)
+Response: response.body.displayName; // "Megan Bowen"
+          response.body.id; // "e8b753b5-4117-464e-9a08-713e1ff266b3"
+```
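+
+As a minimal sketch, _showIncomingCall_ could resolve the display name via Graph like this, assuming the hypothetical `graphAccessToken` variable from the first step and a hypothetical `renderIncomingCallUi` helper:
+
+```js
+// Hedged sketch: resolving the caller's display name via Microsoft Graph.
+async function showIncomingCall(callerInfo, incomingCall) {
+    const teamsUserId = callerInfo.identifier.microsoftTeamsUserId;
+    const res = await fetch(`https://graph.microsoft.com/v1.0/users/${teamsUserId}`, {
+        headers: { Authorization: `Bearer ${graphAccessToken}` }
+    });
+    const { displayName } = await res.json();
+    renderIncomingCallUi(displayName, incomingCall); // hypothetical UI helper
+}
+```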
+
+When the information is retrieved, the notification is shown to Alice in Contoso's client application.
+
+5. Alice accepts the call: Alice selects a button to accept the call and the connection between Alice and Megan is established.
+
+## Use case 5: Join Teams meeting
+This scenario shows a multi-tenant use case, where company Contoso provides SaaS to company Fabrikam. The SaaS allows Fabrikam's users to join Teams meetings via a custom website that takes the identity of the Teams user.
+
+![Diagram showing the user experience of Alice joining a Teams meeting.](./media/custom-teams-endpoint/end-to-end-use-cases/cte-e2e-cte-to-meeting-overview.svg)
+
+The following sequence diagram shows detailed steps for joining a Teams meeting:
++
+### Steps
+1. Authenticate Alice from Fabrikam in Contoso's client application: Alice uses a browser to open Fabrikam's web page and authenticates. You can find more details about [authentication with a Teams identity](./custom-teams-endpoint-authentication-overview.md). If the authentication is successful, Alice is redirected to the initial page.
+2. Load Teams meetings and their identifiers: Contoso's client application uses the Graph API to get a list of Teams meetings for Fabrikam's users. Alice or her admin needs to consent to the Graph API permission for this action. You can learn more about [the Graph API command in the documentation](https://docs.microsoft.com/graph/api/user-list-calendarview).
+
+```
+GET https://graph.microsoft.com/v1.0/me/calendar/calendarView?startDateTime={start_datetime}&endDateTime={end_datetime}
+Permissions: Calendars.Read (delegated)
+Response: response.body.value[0].subject; // "Project Tailspin"
+ response.body.value[0].onlineMeeting.joinUrl; // "https://teams.microsoft.com/l/meetup-join/..."
+ response.body.value[0].start.dateTime;
+ response.body.value[0].end.dateTime;
+ response.body.value[0].location.displayName;
+```
+
+Contoso's client application will then show the list of Teams meetings and the ability to join them.
+
+3. Join the Teams meeting "Project Tailspin": Alice selects a button to join the Teams meeting "Project Tailspin" in Contoso's client application. The client application uses the Azure Communication Services Calling SDK to provide the calling capability and creates an instance of `CallAgent`, which holds the Azure Communication Services access token acquired during the first step.
+
+```js
+const callClient = new CallClient();
+const tokenCredential = new AzureCommunicationTokenCredential('<ALICE_ACCESS_TOKEN>');
+const callAgent = await callClient.createCallAgent(tokenCredential);
+```
+Then the application joins the meeting via the received `joinUrl`.
+
+```js
+// With the JavaScript Calling SDK, a Teams meeting is joined by its link.
+const meetingLocator = { meetingLink: "https://teams.microsoft.com/l/meetup-join/..." };
+const call = callAgent.join(meetingLocator);
+```
+
+Alice then joins the Teams meeting.
+
+4. Other participants join the Teams meeting: The provided experience is a standard Teams meeting. Based on the configuration and invites, the Teams meeting can be joined by Teams users or anonymous Teams users via the Teams web, desktop, or mobile clients, by Azure Communication Services users via applications based on the Communication Services Calling SDK, or by users on phones.
+
+## Next steps
+
+The following articles might be of interest to you:
+
+- Learn more about [authentication](../authentication.md).
+- Try [quickstart for authentication of Teams users](../../quickstarts/manage-teams-identity.md).
+- Try [quickstart for calling to a Teams user](../../quickstarts/voice-video-calling/get-started-with-voice-video-calling-custom-teams-client.md).
communication-services Join Teams Meeting https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/join-teams-meeting.md
> [!IMPORTANT] > BYOI interoperability is now generally available to all Communication Services applications and Teams organizations.
-Azure Communication Services can be used to build applications that enable users to join and participate in Teams meetings. [Standard ACS pricing](https://azure.microsoft.com/pricing/details/communication-services/) applies to these users, but there's no additional fee for the interoperability capability itself. With the bring your own identity (BYOI) model, you control user authentication and users of your applications don't need Teams licenses to join Teams meetings. This is ideal for business-to-consumer solutions that enable licensed Teams users (for example, healthcare providers or financial advisors) and external users (for example, patients or clients) using a custom application to join into a virtual consultation experience.
+Azure Communication Services can be used to build applications that enable users to join and participate in Teams meetings. [Standard ACS pricing](https://azure.microsoft.com/pricing/details/communication-services/) applies to these users, but there's no additional fee for the interoperability capability itself. With the bring your own identity (BYOI) model, you control user authentication and users of your applications don't need Teams licenses to join Teams meetings. This is ideal for applications that enable licensed Teams users and external users using a custom application to join into a virtual consultation experience. For example, healthcare providers using Teams can conduct telehealth virtual visits with their patients who use a custom application.
It's also possible to use Teams identities with the Azure Communication Services SDKs. More information is available [here](./teams-interop.md).
It's currently not possible for a Teams user to join a call that was initiated u
## Enabling anonymous meeting join in your Teams tenant
-When a BYOI user joins a Teams meeting, they're treated as an anonymous external user, similar to users that join a Teams meeting anonymously using the Teams web application. The ability for BYOI users to join Teams meetings as anonymous users is controlled by the existing "allow anonymous meeting join" configuration, which also controls the existing Teams anonymous meeting join. This setting can be updated in the [Teams admin center](https://admin.teams.microsoft.com/meetings/settings) or with the Teams PowerShell cmdlet [Set-CsTeamsMeetingConfiguration](/powershell/module/skype/set-csteamsmeetingconfiguration).
+When a BYOI user joins a Teams meeting, they're treated as an anonymous external user, similar to users that join a Teams meeting anonymously using the Teams web application. The ability for BYOI users to join Teams meetings as anonymous users is controlled by the existing "allow anonymous meeting join" configuration. This same configuration also controls the existing Teams anonymous meeting join. This setting can be updated in the [Teams admin center](https://admin.teams.microsoft.com/meetings/settings) or with the Teams PowerShell cmdlet [Set-CsTeamsMeetingConfiguration](/powershell/module/skype/set-csteamsmeetingconfiguration).
-Custom applications built with Azure Communication Services to connect and communicate with Teams users may be used by end users or by bots, and there is no differentiation in how they appear to Teams users unless the developer of the application explicitly indicates this as part of the communication. Your custom application should consider user authentication and other security measures to protect Teams meetings. Be mindful of the security implications of enabling anonymous users to join meetings, and use the [Teams security guide](/microsoftteams/teams-security-guide#addressing-threats-to-teams-meetings) to configure capabilities available to anonymous users.
+Custom applications built with Azure Communication Services to connect and communicate with Teams users can be used by end users or by bots, and there is no differentiation in how they appear to Teams users unless the developer of the application explicitly indicates this as part of the communication. Your custom application should consider user authentication and other security measures to protect Teams meetings. Be mindful of the security implications of enabling anonymous users to join meetings, and use the [Teams security guide](/microsoftteams/teams-security-guide#addressing-threats-to-teams-meetings) to configure capabilities available to anonymous users.
## Meeting experience
-As with Teams anonymous meeting join, your application must have the meeting link to join, which can be retrieved via the Graph API or from the calendar in Microsoft Teams. The name of BYOI users displayed in Teams is configurable via the Communication Services Calling SDK and they're labeled as "external" to let Teams users know they haven't been authenticated using Azure Active Directory. When the first ACS user joins a Teams meeting, the Teams client will display a message indicating that some features might not be available because one of the participants is using a custom client.
+As with Teams anonymous meeting join, your application must have the meeting link to join, which can be retrieved via the Graph API or from the calendar in Microsoft Teams. The name of BYOI users that is displayed in Teams is configurable via the Communication Services Calling SDK. They are labeled as "external" to let Teams users know they weren't authenticated using Azure Active Directory.
-A Communication Service user will not be admitted to a Teams meeting until there is at least one Teams user present in the meeting. Once a Teams user is present, then the Communication Services user will wait in the lobby until explicitly admitted by a Teams user, unless the "Who can bypass the lobby?" meeting policy/setting is set to "Everyone".
+A Communication Service user won't be admitted to a Teams meeting until there is at least one Teams user present in the meeting. Once a Teams user is present, then the Communication Services user will wait in the lobby until explicitly admitted by a Teams user, unless the "Who can bypass the lobby?" meeting policy/setting is set to "Everyone".
-During a meeting, Communication Services users will be able to use core audio, video, screen sharing, and chat functionality via Azure Communication Services SDKs. Once a Communication Services user leaves the meeting or the meeting ends, they can no longer send or receive new chat messages, but they will have access to messages sent and received during the meeting. Anonymous Communication Services users cannot add/remove participants to/from the meeting and they cannot start recording or transcription for the meeting.
+During a meeting, Communication Services users will be able to use core audio, video, screen sharing, and chat functionality via Azure Communication Services SDKs. Once a Communication Services user leaves the meeting or the meeting ends, they're no longer able to send or receive new chat messages, but they'll have access to messages sent and received during the meeting. Anonymous Communication Services users can't add/remove participants to/from the meeting nor can they start recording or transcription for the meeting.
Additional information on required dataflows for joining Teams meetings is available at the [client and server architecture page](client-and-server-architecture.md). The [Group Calling Hero Sample](../samples/calling-hero-sample.md) provides example code for joining a Teams meeting from a web application.
+## Diagnostics and call analytics
+After a Teams meeting ends, diagnostic information about the meeting is available using the [Communication Services logging and diagnostics](/azure/communication-services/concepts/logging-and-diagnostics) and using the [Teams Call Analytics](/MicrosoftTeams/use-call-analytics-to-troubleshoot-poor-call-quality) in the Teams admin center. Communication Services users will appear as "Anonymous" in Call Analytics screens. Communication Services users aren't included in the [Teams real-time Analytics](/microsoftteams/use-real-time-telemetry-to-troubleshoot-poor-meeting-quality).
+ ## Privacy Interoperability between Azure Communication Services and Microsoft Teams enables your applications and users to participate in Teams calls, meetings, and chat. It is your responsibility to ensure that the users of your application are notified when recording or transcription are enabled in a Teams call or meeting.
Microsoft will indicate to you via the Azure Communication Services API that rec
## Limitations and known issues -- Communication Services users may join a Teams meeting that is scheduled for a Teams channel and use audio and video, but they will not be able to send or receive any chat messages, since they are not members of the channel.-- Communication Services users may join a Teams webinar, but the presenter and attendee roles are not currently enforced, thus Communication Services users could perform actions not intended for attendees, such as screen sharing, turning their camera on/off, or unmuting themselves, if your application provides UX for those actions.
+- Communication Services users can join a Teams meeting that is scheduled for a Teams channel and use audio and video, but they won't be able to send or receive any chat messages because they aren't members of the channel.
+- Communication Services users may join a Teams webinar, but the presenter and attendee roles aren't currently enforced, thus Communication Services users could perform actions not intended for attendees, such as screen sharing, turning their camera on/off, or unmuting themselves, if your application provides UX for those actions.
- When using Microsoft Graph to [list the participants in a Teams meeting](/graph/api/call-list-participants), details for Communication Services users are not currently included.-- PowerPoint presentations are not rendered for Communication Services users.
+- PowerPoint presentations aren't rendered for Communication Services users.
- Teams meetings support up to 1000 participants, but the Azure Communication Services Calling SDK currently only supports 350 participants and Chat SDK supports 250 participants. - With [Cloud Video Interop for Microsoft Teams](/microsoftteams/cloud-video-interop), some devices have seen issues when a Communication Services user shares their screen.-- [Communication Services voice and video calling events](../../event-grid/communication-services-voice-video-events.md) are not raised for Teams meeting.
+- [Communication Services voice and video calling events](/azure/event-grid/communication-services-voice-video-events) aren't raised for Teams meeting.
- Features such as reactions, raised hand, together mode, and breakout rooms are only available for Teams users.-- Communication Services users cannot interact with poll or Q&A apps in meetings.-- Communication Services won't have access to all chat features supported by Teams. They can send and receive text messages, use typing indicators, read receipts and other features supported by Chat SDK. However features like file sharing, reply or react to a message are not supported for Communication Services users. -- The Calling SDK does not currently support closed captions for Teams meetings.-- Communication Services users are not included in the [Real-Time Analytics](/microsoftteams/use-real-time-telemetry-to-troubleshoot-poor-meeting-quality).-- Communication Services users cannot join [Teams live events](/microsoftteams/teams-live-events/what-are-teams-live-events).-- [Teams activity handler events](/microsoftteams/platform/bots/bot-basics?tabs=csharp) for bots do not fire when Communication Services users join a Teams meeting.
+- Communication Services users can't interact with poll or Q&A apps in meetings.
+- Communication Services won't have access to all chat features supported by Teams. They can send and receive text messages, use typing indicators, read receipts and other features supported by Chat SDK. However features like file sharing, reply or react to a message aren't supported for Communication Services users.
+- The Calling SDK doesn't currently support closed captions for Teams meetings.
+- Communication Services users can't join [Teams live events](/microsoftteams/teams-live-events/what-are-teams-live-events).
+- [Teams activity handler events](/microsoftteams/platform/bots/bot-basics?tabs=csharp) for bots don't fire when Communication Services users join a Teams meeting.
## Next steps
communication-services Concepts https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/router/concepts.md
Azure Communication Services Job Router solves the problem of matching supply with demand.
-A real-world example of this may be call center agents (supply) being matched to incoming support calls (demand).
+A real-world example of this is matching call center agents (supply) to incoming support calls (demand).
## Job
-A Job represents a unit of work (demand), which needs to be routed to an available Worker (supply).
+A Job is a unit of work (demand) that must be routed to an available Worker (supply).
-A real-world example of this may be an incoming call or chat in the context of a call center.
+A real-world example is an incoming call or chat in the context of a call center.
### Job submission flow 1. Your application submits a Job via the Job Router SDK.
-2. The Job is classified and a [JobClassified Event][job_classified_event] is sent via EventGrid, which includes all the information about the Job and how the classification process may have modified its properties.
-
- :::image type="content" source="../media/router/acs-router-job-submission.png" alt-text="Diagram showing Communication Services' Job Router submitting a job.":::
+2. The Job is classified and a [JobClassified Event][job_classified_event] is sent via Event Grid.
+
+ :::image type="content" source="../media/router/acs-router-job-submission.png" alt-text="Diagram of job submission.":::
## Worker
-A Worker represents the supply available to handle a Job. Each worker registers with one or more queues to receive jobs.
+A Worker is the supply available to handle a Job. Each worker registers with one or more queues to receive jobs.
-A real-world example of this may be an agent working in a call center.
+A real-world example is an agent in a call center.
### Worker registration flow 1. When your Worker is ready to take on work, you can register the worker via the Job Router SDK. 2. Job Router then sends a [WorkerRegistered Event][worker_registered_event]
- :::image type="content" source="../media/router/acs-router-worker-registration.png" alt-text="Diagram showing Communication Services' Job Router worker registration.":::
+ :::image type="content" source="../media/router/acs-router-worker-registration.png" alt-text="Diagram of worker registration.":::
## Queue
-A Queue represents an ordered list of jobs waiting to be served by a worker. Workers will register with a queue to receive work from it.
+A Queue is an ordered list of jobs that are waiting to be served by a worker. Workers register with a queue to receive work from it.
-A real-world example of this may be a call queue in a call center.
+A real-world example is a call queue in a call center.
## Channel
-A Channel represents a grouping of jobs by some type. When a worker registers to receive work, they must also specify for which channels they can handle work, and how much of each can they handle concurrently. Channels are just a string discriminator and aren't explicitly created.
+A Channel is a grouping of jobs by some type. When workers register to receive work, they must also specify which channels they can handle work for, and how much of each they can handle concurrently. Channels are just a string discriminator and aren't explicitly created.
-A real-world example of this may be `voice calls` or `chats` in a call center.
+Real-world examples are `voice calls` or `chats` in a call center.
## Offer
-An Offer is extended by JobRouter to a worker to handle a particular job when it determines a match. When this happens, you'll be notified via [EventGrid][subscribe_events]. You can either accept or decline the offer using the JobRouter SDK, or it will expire according to the time to live configured on the Distribution Policy.
+An Offer is extended by Job Router to a worker to handle a particular job when it determines a match. You can either accept or decline the offer with the JobRouter SDK. If you ignore the offer, it expires according to the time to live configured on the Distribution Policy.
-A real-world example of this may be the ringing of an agent in a call center.
+A real-world example is the ringing of an agent in a call center.
### Offer flow
-1. When Job Router finds a matching Worker for a Job, it offers the work by sending a [OfferIssued Event][offer_issued_event] via EventGrid.
+1. When Job Router finds a matching Worker for a Job, it creates an Offer and sends an [OfferIssued Event][offer_issued_event] via [Event Grid][subscribe_events].
2. The Offer is accepted via the Job Router API.
-3. Job Router sends a [OfferAccepted Event][offer_accepted_event] signifying to the Contoso Application the Worker is assigned to the Job.
+3. Job Router sends an [OfferAccepted Event][offer_accepted_event].
:::image type="content" source="../media/router/acs-router-accept-offer.png" alt-text="Diagram showing Communication Services' Job Router accept offer."::: ## Distribution Policy
-A Distribution Policy represents a configuration set that controls how jobs in a queue are distributed to workers registered with that queue.
+A Distribution Policy is a configuration set that controls how jobs in a queue are distributed to workers registered with that queue.
This configuration includes: - How long an Offer is valid before it expires.
This configuration includes:
### Distribution modes
-The 3 types of modes are
+The three types of modes are
- **Round Robin**: Workers are ordered by `Id` and the next worker after the previous one that got an offer is picked. - **Longest Idle**: The worker that has not been working on a job for the longest.-- **Best Worker**: The workers that are best able to handle the job will be picked first. The logic to determine this can be optionally customized by specifying an expression or azure function to compare 2 workers and determine which one to pick.
+- **Best Worker**: The workers that are best able to handle the job are picked first. The logic to rank workers can be customized with an expression or Azure Function that compares two workers.
## Labels
-You can attach labels to workers, jobs, and queues. These are key value pairs that can be of `string`, `number` or `boolean` data types.
+You can attach labels to workers, jobs, and queues. Labels are key value pairs that can be of `string`, `number`, or `boolean` data types.
-A real-world example of this may be the skill level of a particular worker or the team or geographic location.
+A real-world example is the skill level of a particular worker or the team or geographic location.
## Label selectors
-Label selectors can be attached to a job in order to target a subset of workers serving the queue.
+Label selectors can be attached to a job in order to target a subset of workers on the queue.
-A real-world example of this may be a condition on an incoming call that the agent must have a minimum level of knowledge of a particular product.
+A real-world example is a condition on an incoming call that the agent must have a minimum level of knowledge of a particular product.
## Classification policy
-A classification policy can be used to dynamically select a queue, determine job priority and attach worker label selectors to a job by leveraging a rules engine.
+A classification policy can be used to programmatically select a queue, determine job priority, or attach worker label selectors to a job.
## Exception policy
An exception policy controls the behavior of a Job based on a trigger and execut
- [Exception Policies](exception-policy.md) - [Quickstart guide](../../quickstarts/router/get-started-router.md) - [Manage queues](../../how-tos/router-sdk/manage-queue.md)-- [Classifying a Job](../../how-tos/router-sdk/job-classification.md)
+- [How to classify a Job](../../how-tos/router-sdk/job-classification.md)
+- [Target a preferred worker](../../how-tos/router-sdk/preferred-worker.md)
- [Escalate a Job](../../how-tos/router-sdk/escalate-job.md) - [Subscribe to events](../../how-tos/router-sdk/subscribe-events.md)
communication-services Direct Routing Infrastructure https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/telephony/direct-routing-infrastructure.md
Title: Azure direct routing infrastructure requirements - Azure Communication Services
+ Title: Azure direct routing infrastructure requirements - Azure Communication Services
description: Familiarize yourself with the infrastructure requirements for Azure Communication Services direct routing configuration
[!INCLUDE [Public Preview](../../includes/public-preview-include-document.md)]
-This article describes infrastructure, licensing, and Session Border Controller (SBC) connectivity details that you'll want to keep in mind as your plan your Azure direct routing deployment.
+This article describes infrastructure, licensing, and Session Border Controller (SBC) connectivity details that you want to keep in mind as you plan your Azure direct routing deployment.
## Infrastructure requirements
The infrastructure requirements for the supported SBCs, domains, and other netwo
|Infrastructure requirement|You need the following| |: |: | |Session Border Controller (SBC)|A supported SBC. For more information, see [Supported SBCs](#supported-session-border-controllers-sbcs).|
-|Telephony trunks connected to the SBC|One or more telephony trunks connected to the SBC. On one end, the SBC connects to the Azure Communication Service via direct routing. The SBC can also connect to third-party telephony entities, such as PBXs, Analog Telephony Adapters, and so on. Any PSTN connectivity option connected to the SBC will work. (For configuration of the PSTN trunks to the SBC, refer to the SBC vendors or trunk providers.)|
+|Telephony trunks connected to the SBC|One or more telephony trunks connected to the SBC. On one end, the SBC connects to the Azure Communication Service via direct routing. The SBC can also connect to third-party telephony entities, such as PBXs and analog telephony adapters. Any Public Switched Telephone Network (PSTN) connectivity option connected to the SBC works. (For configuration of the PSTN trunks to the SBC, refer to the SBC vendors or trunk providers.)|
|Azure subscription|An Azure subscription that you use to create Communication Services resource, and the configuration and connection to the SBC.| |Communication Services Access Token|To make calls, you need a valid Access Token with `voip` scope. See [Access Tokens](../identity-model.md#access-tokens)| |Public IP address for the SBC|A public IP address that can be used to connect to the SBC. Based on the type of SBC, the SBC can use NAT.|
-|Fully Qualified Domain Name (FQDN) for the SBC|An FQDN for the SBC, where the domain portion of the FQDN does not match registered domains in your Microsoft 365 or Office 365 organization. For more information, see [SBC domain names](#sbc-domain-names).|
-|Public DNS entry for the SBC |A public DNS entry mapping the SBC FQDN to the public IP Address. |
-|Public trusted certificate for the SBC |A certificate for the SBC to be used for all communication with Azure direct routing. For more information, see [Public trusted certificate for the SBC](#public-trusted-certificate-for-the-sbc).|
+|Fully Qualified Domain Name (FQDN) for the SBC|An FQDN for the SBC, where the domain portion of the FQDN doesn't match registered domains in your Microsoft 365 or Office 365 organization. For more information, see [SBC certificates and domain names](#sbc-certificates-and-domain-names).|
+|Public DNS entry for the SBC |A public DNS entry mapping the SBC FQDN to the public IP address. |
+|Public trusted certificate for the SBC |A certificate for the SBC to be used for all communication with Azure direct routing. For more information, see [SBC certificates and domain names](#sbc-certificates-and-domain-names).|
|Firewall IP addresses and ports for SIP signaling and media |The SBC communicates to the following services in the cloud:<br/><br/>SIP Proxy, which handles the signaling<br/>Media Processor, which handles media<br/><br/>These two services have separate IP addresses in Microsoft Cloud, described later in this document.
-## SBC domain names
+## SBC certificates and domain names
-Customers without Office 365 can use any domain name for which they can obtain a public certificate.
+Microsoft recommends that you request the certificate for the SBC by generating a certificate signing request (CSR). For specific instructions on how to generate a CSR for an SBC, refer to the interconnection instructions or documentation provided by your SBC vendors.
-The following table shows examples of DNS names registered for the tenant, whether the name can be used as a fully qualified domain name (FQDN) for the SBC, and examples of valid FQDN names:
+ >[!NOTE]
+ > Most Certificate Authorities (CAs) require the private key size to be at least 2048. Keep this in mind when you generate the CSR.
-|DNS name|Can be used for SBC FQDN|Examples of FQDN names|
-|: |: |: |
-contoso.com|Yes|**Valid names:**<br/>sbc1.contoso.com<br/>ssbcs15.contoso.com<br/>europe.contoso.com|
-|contoso.onmicrosoft.com|No|Using *.onmicrosoft.com domains is not supported for SBC names
+The certificate must have the SBC FQDN in the common name (CN) or the subject alternative name (SAN) field. The certificate should be issued directly from a certification authority, not from an intermediate provider.
-If you are an Office 365 customer, then the SBC domain name must not match registered in Domains of the Office 365 tenant. Below is the example of Office 365 and Azure Communication Service coexistence:
+Alternatively, Communication Services direct routing supports a wildcard in the CN and/or SAN, and the wildcard must conform to standard [RFC HTTP Over TLS](https://tools.ietf.org/html/rfc2818#section-3.1).
-|Domain registered in Office 365|Examples of SBC FQDN in Teams|Examples of SBC FQDN names in Azure Communication Services|
-|: |: |: |
-**contoso.com** (second level domain)|**sbc.contoso.com** (name in the second level domain)|**sbc.acs.contoso.com** (name in the third level domain)<br/>**sbc.fabrikam.com** (any name within different domain)|
-|**o365.contoso.com** (third level domain)|**sbc.o365.contoso.com** (name in the third level domain)|**sbc.contoso.com** (name in the second level domain)<br/>**sbc.acs.o365.contoso.com** (name in the fourth level domain)<br/>**sbc.fabrikam.com** (any name within different domain)
+Customers who already use Office 365 and have a domain registered in the Microsoft 365 admin center can use an SBC FQDN from the same domain.
+Domains that weren't previously used in Office 365 must be provisioned.
-SBC pairing works on the Communication Services resource level, meaning you can pair many SBCs to a single Communication Services resource, but you cannot pair a single SBC to more than one Communication Services resource. Unique SBC FQDNs are required for pairing to different resources.
-
-## Public trusted certificate for the SBC
-
-Microsoft recommends that you request the certificate for the SBC by generating a certification signing request (CSR). For specific instructions on generating a CSR for an SBC, refer to the interconnection instructions or documentation provided by your SBC vendors.
+An example would be using `\*.contoso.com`, which would match the SBC FQDN `sbc.contoso.com`, but wouldn't match with `sbc.test.contoso.com`.
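+
+As an illustration only (this is not part of the service), the RFC 2818 matching rule can be sketched in a few lines: the wildcard covers exactly one left-most DNS label.
+
+```js
+// Hedged sketch of RFC 2818-style wildcard matching.
+function matchesWildcard(pattern, fqdn) {
+    const patternLabels = pattern.split('.');
+    const fqdnLabels = fqdn.split('.');
+    if (patternLabels[0] !== '*') return pattern === fqdn;
+    return patternLabels.length === fqdnLabels.length &&
+        patternLabels.slice(1).join('.') === fqdnLabels.slice(1).join('.');
+}
+
+console.log(matchesWildcard('*.contoso.com', 'sbc.contoso.com'));      // true
+console.log(matchesWildcard('*.contoso.com', 'sbc.test.contoso.com')); // false
+```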
- > [!NOTE]
- > Most Certificate Authorities (CAs) require the private key size to be at least 2048. Keep this in mind when generating the CSR.
+>[!IMPORTANT]
+>During Public Preview only: if you plan to use a wildcard certificate for the domain that is not registered in Teams, please raise a support ticket, and our team will add it as a trusted domain.
-The certificate needs to have the SBC FQDN as the common name (CN) or the subject alternative name (SAN) field. The certificate should be issued directly from a certification authority, not from an intermediate provider.
+Communication Services only trusts certificates signed by Certificate Authorities (CAs) that are part of the Microsoft Trusted Root Certificate Program. Ensure that your SBC certificate is signed by a CA that is part of the program, and that the Extended Key Usage (EKU) extension of your certificate includes Server Authentication.
+Learn more:
-Alternatively, Communication Services direct routing supports a wildcard in the CN and/or SAN, and the wildcard needs to conform to standard [RFC HTTP Over TLS](https://tools.ietf.org/html/rfc2818#section-3.1).
+[Program Requirements - Microsoft Trusted Root Program](/security/trusted-root/program-requirements)
+
+[Included CA Certificate List](https://ccadb-public.secure.force.com/microsoft/IncludedCACertificateReportForMSFT)
-An example would be using `\*.contoso.com`, which would match the SBC FQDN `sbc.contoso.com`, but wouldn't match with `sbc.test.contoso.com`.
+SBC pairing works at the Communication Services resource level: you can pair many SBCs to a single Communication Services resource, but you can't pair a single SBC to more than one Communication Services resource. Unique SBC FQDNs are required for pairing to different resources.
-The certificate needs to be generated by one of the following root certificate authorities:
--- AffirmTrust-- AddTrust External CA Root-- Baltimore CyberTrust Root*-- Buypass-- Cybertrust-- Class 3 Public Primary Certification Authority-- Comodo Secure Root CA-- Deutsche Telekom -- DigiCert Global Root CA-- DigiCert High Assurance EV Root CA-- Entrust-- GlobalSign-- Go Daddy-- GeoTrust-- Verisign, Inc. -- SSL.com-- Starfield-- Symantec Enterprise Mobile Root for Microsoft -- SwissSign-- Thawte Timestamping CA-- Trustwave-- TeliaSonera -- T-Systems International GmbH (Deutsche Telekom)-- QuoVadis-- USERTrust RSA Certification Authority-- Hongkong Post Root CA 1,2,3-- Sectigo Root CA-
-Microsoft is working on adding more certification authorities based on customer requests.
## SIP Signaling: FQDNs The connection points for Communication Services direct routing are the following three FQDNs: -- **sip.pstnhub.microsoft.com** ΓÇô Global FQDN ΓÇô must be tried first. When the SBC sends a request to resolve this name, the Microsoft Azure DNS servers return an IP address pointing to the primary Azure datacenter assigned to the SBC. The assignment is based on performance metrics of the datacenters and geographical proximity to the SBC. The IP address returned corresponds to the primary FQDN.-- **sip2.pstnhub.microsoft.com** ΓÇô Secondary FQDN ΓÇô geographically maps to the second priority region.-- **sip3.pstnhub.microsoft.com** ΓÇô Tertiary FQDN ΓÇô geographically maps to the third priority region.
+- **sip.pstnhub.microsoft.com** - Global FQDN - must be tried first. When the SBC sends a request to resolve this name, the Microsoft Azure DNS servers return an IP address that points to the primary Azure datacenter assigned to the SBC. The assignment is based on performance metrics of the datacenters and geographical proximity to the SBC. The IP address returned corresponds to the primary FQDN.
+- **sip2.pstnhub.microsoft.com** - Secondary FQDN - geographically maps to the second priority region.
+- **sip3.pstnhub.microsoft.com** - Tertiary FQDN - geographically maps to the third priority region.
-Placing these three FQDNs in order is required to:
+Trying these three FQDNs in order is required to:
- Provide optimal experience (less loaded and closest to the SBC datacenter assigned by querying the first FQDN).-- Provide failover when connection from an SBC is established to a datacenter that is experiencing a temporary issue. For more information, see [Failover mechanism](#failover-mechanism-for-sip-signaling) below.
+- Provide failover when connection from an SBC is established to a datacenter that is experiencing a temporary issue. For more information, see [Failover mechanism](#failover-mechanism-for-sip-signaling).
-The FQDNs ΓÇô sip.pstnhub.microsoft.com, sip2.pstnhub.microsoft.com, and sip3.pstnhub.microsoft.com ΓÇô will be resolved to one of the following IP addresses:
+The FQDNs sip.pstnhub.microsoft.com, sip2.pstnhub.microsoft.com, and sip3.pstnhub.microsoft.com resolve to one of the following IP addresses:
- `52.112.0.0/14 (IP addresses from 52.112.0.1 to 52.115.255.254)` - `52.120.0.0/14 (IP addresses from 52.120.0.1 to 52.123.255.254)`
Use the following ports for Communication Services Azure direct routing:
|Traffic|From|To|Source port|Destination port| |: |: |: |: |: |
-|SIP/TLS|SIP Proxy|SBC|1024 ΓÇô 65535|Defined on the SBC (For Office 365 GCC High/DoD only port 5061 must be used)|
+|SIP/TLS|SIP Proxy|SBC|1024-65535|Defined on the SBC (For Office 365 GCC High/DoD only port 5061 must be used)|
SIP/TLS|SBC|SIP Proxy|Defined on the SBC|5061| ### Failover mechanism for SIP Signaling
-The SBC makes a DNS query to resolve sip.pstnhub.microsoft.com. Based on the SBC location and the datacenter performance metrics, the primary datacenter is selected. If the primary datacenter experiences an issue, the SBC will try the sip2.pstnhub.microsoft.com, which resolves to the second assigned datacenter, and, in the rare case that datacenters in two regions are not available, the SBC retries the last FQDN (sip3.pstnhub.microsoft.com), which provides the tertiary datacenter IP.
+The SBC makes a DNS query to resolve sip.pstnhub.microsoft.com. Based on the SBC location and the datacenter performance metrics, the primary datacenter is selected. If the primary datacenter experiences an issue, the SBC tries sip2.pstnhub.microsoft.com, which resolves to the second assigned datacenter. In the rare case that datacenters in two regions aren't available, the SBC retries the last FQDN (sip3.pstnhub.microsoft.com), which provides the tertiary datacenter IP.
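+
+Purely as an illustration of the ordered failover (real SBCs implement this in their SIP stack, not in application code, and `tryConnect` is a hypothetical helper):
+
+```js
+// Hedged sketch: try each SIP proxy FQDN in priority order until one connects.
+const sipProxies = [
+    'sip.pstnhub.microsoft.com',  // primary
+    'sip2.pstnhub.microsoft.com', // secondary
+    'sip3.pstnhub.microsoft.com'  // tertiary
+];
+for (const fqdn of sipProxies) {
+    if (await tryConnect(fqdn)) break; // hypothetical connection attempt
+}
+```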
## Media traffic: IP and Port ranges
The port range of the Media Processors is shown in the following table:
|Traffic|From|To|Source port|Destination port| |: |: |: |: |: |
-|UDP/SRTP|Media Processor|SBC|3478-3481 and 49152 ΓÇô 53247|Defined on the SBC|
-|UDP/SRTP|SBC|Media Processor|Defined on the SBC|3478-3481 and 49152 ΓÇô 53247|
+|UDP/SRTP|Media Processor|SBC|3478-3481 and 49152-53247|Defined on the SBC|
+|UDP/SRTP|SBC|Media Processor|Defined on the SBC|3478-3481 and 49152-53247|
> [!NOTE] > Microsoft recommends at least two ports per concurrent call on the SBC.
You can force use of the specific codec on the Session Border Controller by excl
### Leg between Communication Services Calling SDK app and Cloud Media Processor
-On the leg between the Cloud Media Processor and Communication Services Calling SDK app, G.722 is used. Microsoft is working on adding more codecs on this leg.
+On the leg between the Cloud Media Processor and Communication Services Calling SDK app, G.722 is used. Work on adding more codecs on this leg is in progress.
## Supported Session Border Controllers (SBCs)
communication-services Preferred Worker https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/how-tos/router-sdk/preferred-worker.md
+
+ Title: Target a Preferred Worker
+
+description: Use Azure Communication Services SDKs to target a job to a specific worker
++++ Last updated : 01/31/2022+
+zone_pivot_groups: acs-js-csharp
+
+#Customer intent: As a developer, I want to target a specific worker
++
+# Target a Preferred Worker
+
+In the context of a call center, customers might be assigned an account manager or have a relationship with a specific worker. As such, you'd want to route a specific job to a specific worker if possible.
++
+## Prerequisites
+
+- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
+- A deployed Communication Services resource. [Create a Communication Services resource](../../quickstarts/create-communication-resource.md).
+- Optional: Complete the quickstart to [get started with Job Router](../../quickstarts/router/get-started-router.md)
+
+## Set up worker selectors
+
+Every worker automatically has an `Id` label. You can apply worker selectors to the job to target a specific worker.
+
+In the following example, a job is created that targets a specific worker. If that worker doesn't accept the job within the TTL of 1 minute, the condition for the specific worker is no longer valid and the job can go to any worker.
++
+```csharp
+await client.CreateJobAsync(
+ channelId: "<channel id>",
+ queueId: "<queue id>",
+ workerSelectors: new List<LabelSelector>
+ {
+ new LabelSelector(
+ key: "Id",
+ @operator: LabelOperator.Equal,
+ value: "<preferred worker id>",
+ ttl: TimeSpan.FromMinutes(1))
+ });
+```
+++
+```typescript
+await client.createJob({
+ channelId: "<channel id>",
+ queueId: "<queue id>",
+ workerSelectors: [
+ {
+ key: "Id",
+ operator: "equal",
+ value: "<preferred worker id>",
+ ttl: "00:01:00"
+ }
+ ]
+});
+```
++
+> [!TIP]
+> You could also use any custom label that is unique to each worker.
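+
+For example, a sketch of the same job using a hypothetical custom `employeeId` label (assuming each worker was registered with that label):
+
+```typescript
+await client.createJob({
+    channelId: "<channel id>",
+    queueId: "<queue id>",
+    workerSelectors: [
+        {
+            key: "employeeId", // hypothetical custom label set on the worker
+            operator: "equal",
+            value: "<preferred worker employee id>",
+            ttl: "00:01:00"
+        }
+    ]
+});
+```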
container-apps Background Processing https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-apps/background-processing.md
# Tutorial: Deploy a background processing application with Azure Container Apps Preview
-Azure Container Apps allows you to deploy applications without requiring the exposure of public endpoints. In this tutorial, you deploy a sample application that reads messages from an Azure Storage Queue and logs the messages in Azure log Analytics workspace. Using Container Apps scale rules, the application can scale up and down based on the Azure Storage queue length. When there are no messages on the queue, the container app scales down to zero.
+Azure Container Apps allows you to deploy applications without exposing public endpoints. By using Container Apps scale rules, the application can scale up and down based on the Azure Storage queue length. When there are no messages on the queue, the container app scales down to zero.
You learn how to: > [!div class="checklist"]+ > * Create a Container Apps environment to deploy your container apps > * Create an Azure Storage Queue to send messages to the container app > * Deploy your background processing application as a container app > * Verify that the queue messages are processed by the container app
-## Prerequisites
-
-The following items are required to complete this tutorial:
-
-* **Azure CLI**: You must have Azure CLI version 2.29.0 or later installed on your local computer.
- * Run `az --version` to find the version. If you need to install or upgrade, see [Install the Azure CLI](/cli/azure/install-azure-cli).
-
-## Setup
-
-This tutorial makes use of the following environment variables:
-
-# [Bash](#tab/bash)
-
-```bash
-RESOURCE_GROUP="my-containerapps"
-LOCATION="canadacentral"
-CONTAINERAPPS_ENVIRONMENT="containerapps-env"
-LOG_ANALYTICS_WORKSPACE="containerapps-logs"
-```
-
-# [PowerShell](#tab/powershell)
-
-```powershell
-$RESOURCE_GROUP="my-containerapps"
-$LOCATION="canadacentral"
-$CONTAINERAPPS_ENVIRONMENT="containerapps-env"
-$LOG_ANALYTICS_WORKSPACE="containerapps-logs"
-```
---
-Create a variable for your storage account name.
-
-# [Bash](#tab/bash)
-
-```bash
-STORAGE_ACCOUNT="<MY_STORAGE_ACCOUNT_NAME>"
-```
-
-# [PowerShell](#tab/powershell)
-
-```powershell
-$STORAGE_ACCOUNT="<storage account name>"
-```
---
-Replace the `<storage account name>` placeholder with your own value before you run this snippet. Storage account names must be unique within Azure, be between 3 and 24 characters in length, and may contain numbers or lowercase letters only. The storage account will be created in a following step.
-
-Next, sign in to Azure from the CLI.
-
-Run the following command, and follow the prompts to complete the authentication process.
-
-# [Bash](#tab/bash)
-
-```azurecli
-az login
-```
-
-# [PowerShell](#tab/powershell)
-
-```azurecli
-az login
-```
---
-To ensure you're running the latest version of the CLI, use the `upgrade` command.
-
-# [Bash](#tab/bash)
-
-```azurecli
-az upgrade
-```
-
-# [PowerShell](#tab/powershell)
-
-```azurecli
-az upgrade
-```
---
-Next, install the Azure Container Apps extension to the CLI.
-
-# [Bash](#tab/bash)
-
-```azurecli
-az extension add \
- --source https://workerappscliextension.blob.core.windows.net/azure-cli-extension/containerapp-0.2.2-py2.py3-none-any.whl
-```
-
-# [PowerShell](#tab/powershell)
-
-```azurecli
-az extension add `
- --source https://workerappscliextension.blob.core.windows.net/azure-cli-extension/containerapp-0.2.2-py2.py3-none-any.whl
-```
---
-Now that the extension is installed, register the `Microsoft.Web` namespace.
-
-# [Bash](#tab/bash)
-
-```azurecli
-az provider register --namespace Microsoft.Web
-```
-
-# [PowerShell](#tab/powershell)
-
-```azurecli
-az provider register --namespace Microsoft.Web
-```
---
-You'll use a resource group to organize the services related to your new container app. Create the group with the following command:
-
-# [Bash](#tab/bash)
-
-```azurecli
-az group create \
- --name $RESOURCE_GROUP \
- --location "$LOCATION"
-```
-
-# [PowerShell](#tab/powershell)
-
-```azurecli
-az group create `
- --name $RESOURCE_GROUP `
- --location $LOCATION
-```
---
-With the CLI upgraded and a new resource group available, you can create a Container Apps environment and deploy your container app.
-
-## Create an environment
-
-Azure Container Apps environments act as secure boundary around a group of container apps. Different container apps in the same environment are deployed in the same virtual network and write logs to the same Log Analytics workspace.
-
-Azure Log Analytics is used to monitor your container app required when creating a Container Apps environment.
-
-Create a new Log Analytics workspace with the following command:
-
-# [Bash](#tab/bash)
-
-```azurecli
-az monitor log-analytics workspace create \
- --resource-group $RESOURCE_GROUP \
- --workspace-name $LOG_ANALYTICS_WORKSPACE
-```
-
-# [PowerShell](#tab/powershell)
-
-```azurecli
-az monitor log-analytics workspace create `
- --resource-group $RESOURCE_GROUP `
- --workspace-name $LOG_ANALYTICS_WORKSPACE
-```
---
-Next, retrieve the Log Analytics Client ID and client secret. Make sure to run each query separately to give enough time for the request to complete.
-
-# [Bash](#tab/bash)
-
-```azurecli
-LOG_ANALYTICS_WORKSPACE_CLIENT_ID=`az monitor log-analytics workspace show --query customerId -g $RESOURCE_GROUP -n $LOG_ANALYTICS_WORKSPACE --out json | tr -d '"'`
-```
-
-```azurecli
-LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET=`az monitor log-analytics workspace get-shared-keys --query primarySharedKey -g $RESOURCE_GROUP -n $LOG_ANALYTICS_WORKSPACE --out json | tr -d '"'`
-```
-
-# [PowerShell](#tab/powershell)
-
-```powershell
-$LOG_ANALYTICS_WORKSPACE_CLIENT_ID=(az monitor log-analytics workspace show --query customerId -g $RESOURCE_GROUP -n $LOG_ANALYTICS_WORKSPACE --out json | tr -d '"')
-```
-
-```powershell
-$LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET=`az monitor log-analytics workspace get-shared-keys --query primarySharedKey -g $RESOURCE_GROUP -n $LOG_ANALYTICS_WORKSPACE --out json | tr -d '"'`
-```
az containerapp env create `
## Set up a storage queue
+Choose a name for the storage account. Storage account names must be *unique within Azure*, from 3 to 24 characters in length, and contain only numbers and lowercase letters.
+
+# [Bash](#tab/bash)
+
+```bash
+STORAGE_ACCOUNT="<storage account name>"
+```
+
+# [PowerShell](#tab/powershell)
+
+```powershell
+$STORAGE_ACCOUNT_NAME="<storage account name>"
+```
+++ Create an Azure Storage account. # [Bash](#tab/bash)
az storage account create \
# [PowerShell](#tab/powershell)
-```azurecli
-az storage account create `
- --name $STORAGE_ACCOUNT `
- --resource-group $RESOURCE_GROUP `
- --location $LOCATION `
- --sku Standard_RAGRS `
- --kind StorageV2
+```powershell
+$STORAGE_ACCOUNT = New-AzStorageAccount `
+ -Name $STORAGE_ACCOUNT_NAME `
+ -ResourceGroupName $RESOURCE_GROUP `
+ -Location $LOCATION `
+ -SkuName Standard_RAGRS `
+ -Kind StorageV2
```
-Next, get the queue's connection string.
+Next, get the connection string for the queue.
# [Bash](#tab/bash)
QUEUE_CONNECTION_STRING=`az storage account show-connection-string -g $RESOURCE_
# [PowerShell](#tab/powershell) ```powershell
-$QUEUE_CONNECTION_STRING=(az storage account show-connection-string -g $RESOURCE_GROUP --name $STORAGE_ACCOUNT --query connectionString --out json | tr -d '"')
+$QUEUE_CONNECTION_STRING=(az storage account show-connection-string -g $RESOURCE_GROUP --name $STORAGE_ACCOUNT_NAME --query connectionString --out json) -replace '"',''
```
az storage queue create \
# [PowerShell](#tab/powershell)
-```azurecli
-az storage queue create `
- --name "myqueue" `
- --account-name $STORAGE_ACCOUNT `
- --connection-string $QUEUE_CONNECTION_STRING
+```powershell
+$queue = New-AzStorageQueue -Name "myqueue" `
+ -Context $STORAGE_ACCOUNT.Context
```
az storage message put \
# [PowerShell](#tab/powershell) ```azurecli
-az storage message put `
- --content "Hello Queue Reader App" `
- --queue-name "myqueue" `
- --connection-string $QUEUE_CONNECTION_STRING
+$queueMessage = [Microsoft.Azure.Storage.Queue.CloudQueueMessage]::new("Hello Queue Reader App")
+$queue.CloudQueue.AddMessageAsync($queueMessage)
```
az deployment group create --resource-group "$RESOURCE_GROUP" \
# [PowerShell](#tab/powershell)
-```azurecli
-az deployment group create --resource-group "$RESOURCE_GROUP" `
- --template-file ./queue.json `
- --parameters `
- environment_name="$CONTAINERAPPS_ENVIRONMENT" `
- queueconnection="$QUEUE_CONNECTION_STRING" `
- location="$LOCATION"
+```powershell
+$params = @{
+ environment_name = $CONTAINERAPPS_ENVIRONMENT
+ location = $LOCATION
+ queueconnection = $QUEUE_CONNECTION_STRING
+}
+
+New-AzResourceGroupDeployment `
+ -ResourceGroupName $RESOURCE_GROUP `
+ -TemplateParameterObject $params `
+ -TemplateFile ./queue.json `
+ -SkipTemplateParameterPrompt
```
The application scales up to 10 replicas based on the queue length as defined in
## Verify the result
-The container app running as a background process creates logs entries in Log analytics as messages arrive from Azure Storage Queue. You may need to wait a few minutes for the analytics to arrive for the first time before you are able to query the logged data.
+The container app runs as a background process. As messages arrive from the Azure Storage queue, the application creates log entries in Log Analytics. You must wait a few minutes for the analytics to arrive for the first time before you can query the logged data.
Run the following command to see the logged messages. This command requires the Log Analytics extension, so accept the prompt to install the extension when requested.
az monitor log-analytics query \
# [PowerShell](#tab/powershell)
-```azurecli
-az monitor log-analytics query `
- --workspace $LOG_ANALYTICS_WORKSPACE_CLIENT_ID `
- --analytics-query "ContainerAppConsoleLogs_CL | where ContainerAppName_s == 'queuereader' and Log_s contains 'Message ID'" `
- --out table
+```powershell
+$queryResults = Invoke-AzOperationalInsightsQuery -WorkspaceId $LOG_ANALYTICS_WORKSPACE_CLIENT_ID -Query "ContainerAppConsoleLogs_CL | where ContainerAppName_s == 'queuereader' and Log_s contains 'Message ID'"
+$queryResults.Results
```
az monitor log-analytics query `
## Clean up resources
-Once you are done, clean up your Container Apps resources by running the following command to delete your resource group.
+Once you are done, run the following command to delete the resource group that contains your Container Apps resources.
# [Bash](#tab/bash)
az group delete \
# [PowerShell](#tab/powershell)
-```azurecli
-az group delete `
- --resource-group $RESOURCE_GROUP
+```powershell
+Remove-AzResourceGroup -Name $RESOURCE_GROUP -Force
```
container-apps Get Started Existing Container Image https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-apps/get-started-existing-container-image.md
Title: 'Quickstart: Deploy an existing container image using the Azure CLI'
+ Title: 'Quickstart: Deploy an existing container image with the Azure CLI'
description: Deploy an existing container image to Azure Container Apps Preview with the Azure CLI.
zone_pivot_groups: container-apps-registry-types
# Quickstart: Deploy an existing container image with the Azure CLI
-Azure Container Apps Preview enables you to run microservices and containerized applications on a serverless platform. With Container Apps, you enjoy the benefits of running containers while leaving behind the concerns of manually configuring cloud infrastructure and complex container orchestrators.
+The Azure Container Apps Preview service enables you to run microservices and containerized applications on a serverless platform. With Container Apps, you enjoy the benefits of running containers while you leave behind the concerns of manual cloud infrastructure configuration and complex container orchestrators.
This article demonstrates how to deploy an existing container to Azure Container Apps.
This article demonstrates how to deploy an existing container to Azure Container
## Prerequisites -- Azure account with an active subscription.
+- An Azure account with an active subscription.
- If you don't have one, you [can create one for free](https://azure.microsoft.com/free/). - Install the [Azure CLI](/cli/azure/install-azure-cli).
az containerapp env create \
--resource-group $RESOURCE_GROUP \ --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID \ --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET \
- --location "$LOCATION"
+ --location $LOCATION
``` # [PowerShell](#tab/powershell)
az containerapp env create `
--resource-group $RESOURCE_GROUP ` --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID ` --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET `
- --location "$LOCATION"
+ --location $LOCATION
``` ## Create a container app
-Now that you have an environment created, you can deploy your first container app. Using the `containerapp create` command, deploy a container image to Azure Container Apps.
+Now that you have an environment created, you can deploy your first container app. With the `containerapp create` command, deploy a container image to Azure Container Apps.
-The example shown in this article demonstrates how to use a custom container image with common commands. Your container image may need more parameters including the following items:
+The example shown in this article demonstrates how to use a custom container image with common commands. Your container image might need more parameters for the following items:
-- Setting the revision mode-- Defining secrets-- Defining environment variables-- Setting container CPU or memory requirements-- Enabling and configuring Dapr-- Enabling internal or internal ingress-- Providing minimum and maximum replica values or scale rules
+- Set the revision mode
+- Define secrets
+- Define environment variables
+- Set container CPU or memory requirements
+- Enable and configure Dapr
+- Enable external or internal ingress
+- Provide minimum and maximum replica values or scale rules
For details on how to provide values for any of these parameters to the `create` command, run `az containerapp create --help`.
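As an illustration only, here is a hedged sketch of a `create` call that supplies a few of these optional values. The flag names below (`--environment`, `--min-replicas`, `--max-replicas`, `--cpu`, `--memory`) are assumptions to verify against `az containerapp create --help` for your installed extension version:

```powershell
az containerapp create `
  --name my-container-app `
  --resource-group $RESOURCE_GROUP `
  --environment $CONTAINERAPPS_ENVIRONMENT `
  --image <REGISTRY_CONTAINER_URL> `
  --min-replicas 1 `
  --max-replicas 5 `
  --cpu 0.5 `
  --memory 1.0Gi
```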
Before you run this command, replace `<REGISTRY_CONTAINER_URL>` with the URL to
::: zone-end
-If you have enabled ingress on your container app, you can add `--query configuration.ingress.fqdn` to the `create` command to return the app's public URL.
+If you have enabled ingress on your container app, you can add `--query configuration.ingress.fqdn` to the `create` command to return the public URL for the application.
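If the app is already created, you can also read the URL afterward; a minimal sketch, assuming the preview extension provides an `az containerapp show` command and reusing the app name from this quickstart:

```powershell
# Print only the public URL of an existing container app.
az containerapp show `
  --name my-container-app `
  --resource-group $RESOURCE_GROUP `
  --query configuration.ingress.fqdn `
  --output tsv
```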
## Verify deployment
-To verify a successful deployment, you can query the Log Analytics workspace. You may need to wait a 5 to 10 minutes for the analytics to arrive for the first time before you are able to query the logs.
+To verify a successful deployment, you can query the Log Analytics workspace. You might have to wait 5-10 minutes after deployment for the analytics to arrive for the first time before you are able to query the logs.
-After about 5 to 10 minutes has passed after creating the container app, use the following steps to view logged messages.
+After about 5-10 minutes have passed, use the following steps to view logged messages.
# [Bash](#tab/bash)
az monitor log-analytics query \
# [PowerShell](#tab/powershell)
-```azurecli
-az monitor log-analytics query `
- --workspace $LOG_ANALYTICS_WORKSPACE_CLIENT_ID `
- --analytics-query "ContainerAppConsoleLogs_CL | where ContainerAppName_s == 'my-container-app' | project ContainerAppName_s, Log_s, TimeGenerated" `
+```powershell
+$queryResults = Invoke-AzOperationalInsightsQuery -WorkspaceId $LOG_ANALYTICS_WORKSPACE_CLIENT_ID -Query "ContainerAppConsoleLogs_CL | where ContainerAppName_s == 'my-container-app' | project ContainerAppName_s, Log_s, TimeGenerated | take 5"
+$queryResults.Results
```
az monitor log-analytics query `
## Clean up resources
-If you're not going to continue to use this application, you can delete the Azure Container Apps instance and all the associated services by removing the resource group.
+If you're not going to continue to use this application, run the following command to delete the resource group along with all the resources created in this quickstart.
# [Bash](#tab/bash)
az group delete \
# [PowerShell](#tab/powershell)
-```azurecli
-az group delete `
- --name $RESOURCE_GROUP
+```powershell
+Remove-AzResourceGroup -Name $RESOURCE_GROUP -Force
```
container-apps Get Started https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-apps/get-started.md
# Quickstart: Deploy your first container app
-Azure Container Apps Preview enables you to run microservices and containerized applications on a serverless platform. With Container Apps, you enjoy the benefits of running containers while leaving behind the concerns of manually configuring cloud infrastructure and complex container orchestrators.
+The Azure Container Apps Preview service enables you to run microservices and containerized applications on a serverless platform. With Container Apps, you enjoy the benefits of running containers while you leave behind the concerns of manually configuring cloud infrastructure and complex container orchestrators.
In this quickstart, you create a secure Container Apps environment and deploy your first container app. ## Prerequisites -- Azure account with an active subscription.
+- An Azure account with an active subscription.
- If you don't have one, you [can create one for free](https://azure.microsoft.com/free/). - Install the [Azure CLI](/cli/azure/install-azure-cli).
az containerapp env create \
--resource-group $RESOURCE_GROUP \ --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID \ --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET \
- --location "$LOCATION"
+ --location $LOCATION
``` # [PowerShell](#tab/powershell)
az containerapp env create `
--resource-group $RESOURCE_GROUP ` --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID ` --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET `
- --location "$LOCATION"
+ --location $LOCATION
``` ## Create a container app
-Now that you have an environment created, you can deploy your first container app. Using the `containerapp create` command, deploy a container image to Azure Container Apps.
+Now that you have an environment created, you can deploy your first container app. With the `containerapp create` command, deploy a container image to Azure Container Apps.
# [Bash](#tab/bash)
By setting `--ingress` to `external`, you make the container app available to pu
## Verify deployment
-The `create` command returned the container app's fully qualified domain name. Copy this location to a web browser and you'll see the following message.
+The `create` command returned the fully qualified domain name for the container app. Copy this location into a web browser to see the following message:
:::image type="content" source="media/get-started/azure-container-apps-quickstart.png" alt-text="Your first Azure Container Apps deployment."::: ## Clean up resources
-If you're not going to continue to use this application, you can delete the Azure Container Apps instance and all the associated services by removing the resource group.
+If you're not going to continue to use this application, run the following command to delete the resource group along with all the resources created in this quickstart.
# [Bash](#tab/bash)
az group delete \
# [PowerShell](#tab/powershell)
-```azurecli
-az group delete `
- --name $RESOURCE_GROUP
+```powershell
+Remove-AzResourceGroup -Name $RESOURCE_GROUP -Force
```
container-apps Microservices Dapr Azure Resource Manager https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-apps/microservices-dapr-azure-resource-manager.md
Title: 'Tutorial: Deploy a Dapr application to Azure Container Apps using an ARM or Bicep template'
-description: Deploy a Dapr application to Azure Container Apps using an ARM or Bicep template.
+ Title: 'Tutorial: Deploy a Dapr application to Azure Container Apps with an ARM or Bicep template'
+description: Deploy a Dapr application to Azure Container Apps with an ARM or Bicep template.
Previously updated : 11/02/2021 Last updated : 01/31/2022 zone_pivot_groups: container-apps
-# Tutorial: Deploy a Dapr application to Azure Container Apps using an ARM or Bicep template
+# Tutorial: Deploy a Dapr application to Azure Container Apps with an Azure Resource Manager or Bicep template
-[Dapr](https://dapr.io/) (Distributed Application Runtime) is a runtime that helps you build resilient stateless and stateful microservices. In this tutorial, a sample Dapr application is deployed to Azure Container Apps.
+[Dapr](https://dapr.io/) (Distributed Application Runtime) is a runtime that helps you build resilient stateless and stateful microservices. In this tutorial, a sample Dapr application is deployed to Azure Container Apps via an Azure Resource Manager (ARM) or Bicep template.
You learn how to: > [!div class="checklist"]+ > * Create a Container Apps environment for your container apps > * Create an Azure Blob Storage state store for the container app
-> * Deploy two apps that a produce and consume messages and persist them using the state store
+> * Deploy two apps that produce and consume messages and persist them with the state store
> * Verify the interaction between the two microservices.
-Azure Container Apps offers a fully managed version of the Dapr APIs when building microservices. When you use Dapr in Azure Container Apps, you can enable sidecars to run next to your microservices that provide a rich set of capabilities. Available Dapr APIs include [Service to Service calls](https://docs.dapr.io/developing-applications/building-blocks/service-invocation/), [Pub/Sub](https://docs.dapr.io/developing-applications/building-blocks/pubsub/), [Event Bindings](https://docs.dapr.io/developing-applications/building-blocks/bindings/), [State Stores](https://docs.dapr.io/developing-applications/building-blocks/state-management/), and [Actors](https://docs.dapr.io/developing-applications/building-blocks/actors/).
+With Azure Container Apps, you get a fully managed version of the Dapr APIs when building microservices. When you use Dapr in Azure Container Apps, you can enable sidecars to run next to your microservices that provide a rich set of capabilities. Available Dapr APIs include [Service to Service calls](https://docs.dapr.io/developing-applications/building-blocks/service-invocation/), [Pub/Sub](https://docs.dapr.io/developing-applications/building-blocks/pubsub/), [Event Bindings](https://docs.dapr.io/developing-applications/building-blocks/bindings/), [State Stores](https://docs.dapr.io/developing-applications/building-blocks/state-management/), and [Actors](https://docs.dapr.io/developing-applications/building-blocks/actors/).
+
+In this tutorial, you deploy the same applications from the Dapr [Hello World](https://github.com/dapr/quickstarts/tree/master/hello-kubernetes) quickstart.
+
+The application consists of:
-In this tutorial, you deploy the same applications from the Dapr [Hello World](https://github.com/dapr/quickstarts/tree/master/hello-kubernetes) quickstart. The quickstart consists of a client (Python) app that generates messages, and a service (Node) app that consumes and persists those messages in a configured state store. The following architecture diagram illustrates the components that make up this tutorial:
+* A client (Python) container app to generate messages.
+* A service (Node) container app to consume and persist those messages in a state store.
+
+The following architecture diagram illustrates the components that make up this tutorial:
:::image type="content" source="media/microservices-dapr/azure-container-apps-microservices-dapr.png" alt-text="Architecture diagram for Dapr Hello World microservices on Azure Container Apps"::: ## Prerequisites
-* [Azure CLI](/cli/azure/install-azure-cli)
+* Install the [Azure CLI](/cli/azure/install-azure-cli)
::: zone pivot="container-apps-bicep"
In this tutorial, you deploy the same applications from the Dapr [Hello World](h
## Before you begin
-This guide makes use of the following environment variables:
+This guide uses the following environment variables:
# [Bash](#tab/bash)
$STORAGE_ACCOUNT_CONTAINER="mycontainer"
-The above snippet can be used to set the environment variables using bash, zsh, or PowerShell.
# [Bash](#tab/bash)
$STORAGE_ACCOUNT="<storage account name>"
-Choose a name for `STORAGE_ACCOUNT`. It will be created in a following step. Storage account names must be *unique within Azure*. It must be between 3 and 24 characters in length and may contain numbers and lowercase letters only.
+Choose a name for `STORAGE_ACCOUNT`. Storage account names must be *unique within Azure*, from 3 to 24 characters in length, and contain only numbers and lowercase letters.
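Optionally, you can verify that your chosen name is free and valid before creating anything; a minimal sketch with the Az.Storage module:

```powershell
# NameAvailable is True when the chosen name is unused and well formed.
Get-AzStorageAccountNameAvailability -Name $STORAGE_ACCOUNT
```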
## Setup
-Begin by signing in to Azure.
-
-Run the following command, and follow the prompts to complete the authentication process.
+First, sign in to Azure.
# [Bash](#tab/bash)
az upgrade
-Next, install the Azure Container Apps extension to the CLI.
+Next, install the Azure Container Apps extension for the Azure CLI.
# [Bash](#tab/bash)
With the CLI upgraded and a new resource group available, you can create a Conta
## Create an environment
-Azure Container Apps environments act as isolation boundaries between a group of container apps. Container Apps deployed to the same environment share the same virtual network and write logs to the same Log Analytics workspace.
+The Azure Container Apps environment acts as a secure boundary around a group of container apps. Container Apps deployed to the same environment share a virtual network and write logs to the same Log Analytics workspace.
-Azure Log Analytics is used to monitor your container app and is required when creating a Container Apps environment.
+Your container apps are monitored with Azure Log Analytics, which is required when you create a Container Apps environment.
-Create a new Log Analytics workspace with the following command:
+Create a Log Analytics workspace with the following command:
# [Bash](#tab/bash)
az containerapp env create `
### Create an Azure Blob Storage account
-Use the following command to create a new Azure Storage account.
+Use the following command to create an Azure Storage account.
# [Bash](#tab/bash)
New-AzStorageAccount -ResourceGroupName $RESOURCE_GROUP `
Once your Azure Blob Storage account is created, the following values are needed for subsequent steps in this tutorial.
-* `storage_account_name` is the value of the `STORAGE_ACCOUNT` variable you chose above.
+* `storage_account_name` is the value of the `STORAGE_ACCOUNT` variable.
-* `storage_container_name` is the value of `STORAGE_ACCOUNT_CONTAINER` defined above (for example, `mycontainer`). Dapr creates a container with this name if it doesn't already exist in your Azure Storage account.
+* `storage_container_name` is the value of the `STORAGE_ACCOUNT_CONTAINER` variable.
-Get the storage account key with the following command.
+Dapr creates a container with this name if it doesn't already exist in your Azure Storage account.
+
+Get the storage account key with the following command:
# [Bash](#tab/bash)
$STORAGE_ACCOUNT_KEY=(Get-AzStorageAccountKey -ResourceGroupName $RESOURCE_GROUP
### Create Azure Resource Manager (ARM) templates
-Create two Azure Resource Manager (ARM) templates.
+Create two ARM templates.
-The ARM template has the Container App definition and a Dapr component definition.
+Each ARM template has a container app definition and a Dapr component definition.
The following example shows how your ARM template should look when configured for your Azure Blob Storage account.
Save the following file as *serviceapp.json*:
Create two Bicep templates.
-The Bicep template has the Container App definition and a Dapr component definition.
+Each Bicep template contains a container app definition and a Dapr component definition.
The following example shows how your Bicep template should look when configured for your Azure Blob Storage account.
resource pythonapp 'Microsoft.Web/containerApps@2021-03-01' = {
::: zone pivot="container-apps-arm"
-Now let's deploy the service Container App. Navigate to the directory in which you stored the ARM template file and run the command below.
+Now deploy the service Container App. Navigate to the directory in which you stored the ARM template file and run the following command:
# [Bash](#tab/bash)
New-AzResourceGroupDeployment `
::: zone pivot="container-apps-bicep"
-Now let's deploy the service Container App. Navigate to the directory in which you stored the Bicep template file and run the command below.
+Now deploy the service container. Navigate to the directory in which you stored the Bicep template file and run the following command:
-A warning (BCP081) may be displayed. This warning will have no effect on successfully deploying the Container App.
+A warning (BCP081) might be displayed. This warning has no effect on the successful deployment of the application.
# [Bash](#tab/bash)
New-AzResourceGroupDeployment `
-This command deploys the service (Node) app server on `targetPort: 3000` (the app's port) along with its accompanying Dapr sidecar configured with `"appId": "nodeapp",` and dapr `"appPort": 3000,` for service discovery and invocation. Your state store is configured using the `components` object of `"type": "state.azure.blobstorage"`, which enables the sidecar to persist state.
+This command deploys:
+
+* the service (Node) app server on `targetPort: 3000` (the app port)
+* its accompanying Dapr sidecar configured with `"appId": "nodeapp"` and Dapr `"appPort": 3000` for service discovery and invocation
+
+Your state store is configured with the `components` object of `"type": "state.azure.blobstorage"`, which enables the sidecar to persist state.
## Deploy the client application (headless client)
-Run the command below to deploy the client container app.
+Run the following command to deploy the client container.
::: zone pivot="container-apps-arm"
New-AzResourceGroupDeployment `
::: zone pivot="container-apps-bicep"
-A warning (BCP081) may be displayed. This warning will have no effect on successfully deploying the Container App.
+A warning (BCP081) might be displayed. This warning has no effect on the successful deployment of the application.
# [Bash](#tab/bash)
This command deploys `pythonapp` that also runs with a Dapr sidecar that is used
### Confirm successful state persistence
-You can confirm the services are working correctly by viewing data in your Azure Storage account.
+You can confirm that the services are working correctly by viewing data in your Azure Storage account, either in the Azure portal as described in the following steps, or from the command line, as shown in the sketch after these steps.
1. Open the [Azure portal](https://portal.azure.com) in your browser and navigate to your storage account.
-1. Select **Containers** on the left.
+1. Select **Containers** from the menu on the left side.
1. Select **mycontainer**. 1. Verify that you can see the file named `order` in the container.
-1. Click on the file.
+1. Select the file.
-1. Click the **Edit** tab.
+1. Select the **Edit** tab.
-1. Click the **Refresh** button to observe updates.
+1. Select the **Refresh** button to observe updates.
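As an alternative to the portal steps above, a minimal PowerShell sketch to inspect the persisted state from the command line, assuming the `$STORAGE_ACCOUNT`, `$STORAGE_ACCOUNT_KEY`, and `$STORAGE_ACCOUNT_CONTAINER` variables defined earlier in this tutorial:

```powershell
# Build a storage context from the account name and key retrieved earlier.
$ctx = New-AzStorageContext -StorageAccountName $STORAGE_ACCOUNT -StorageAccountKey $STORAGE_ACCOUNT_KEY

# List the blobs in the container; an 'order' blob appears once state is persisted.
Get-AzStorageBlob -Container $STORAGE_ACCOUNT_CONTAINER -Context $ctx

# Download the blob and print the persisted order state.
Get-AzStorageBlobContent -Blob "order" -Container $STORAGE_ACCOUNT_CONTAINER -Context $ctx -Destination "./order" -Force | Out-Null
Get-Content "./order"
```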
### View Logs
-Data logged via a container app are stored in the `ContainerAppConsoleLogs_CL` custom table in the Log Analytics workspace. You can view logs through the Azure portal or from the command line. You may need to wait a few minutes for the analytics to arrive for the first time before you can query the logged data.
+Data logged from a container app is stored in the `ContainerAppConsoleLogs_CL` custom table in the Log Analytics workspace. You can view logs through the Azure portal or from the command line. Wait a few minutes for the analytics to arrive for the first time before you query the logged data.
Use the following command to view logs in bash or PowerShell.
nodeapp Successfully persisted state. PrimaryResult 2021-10-22
nodeapp Got a new order! Order ID: 63 PrimaryResult 2021-10-22T22:45:44.618Z ```
-> [!TIP]
-> Having issues? Let us know on GitHub by opening an issue in the [Azure Container Apps repo](https://github.com/microsoft/azure-container-apps).
- ## Clean up resources
-Once you're done, clean up your Container App resources by running the following command to delete your resource group.
+Once you are done, run the following command to delete your resource group along with all the resources you created in this tutorial.
# [Bash](#tab/bash)
Remove-AzResourceGroup -Name $RESOURCE_GROUP -Force
-This command deletes both container apps, the storage account, the container apps environment, and any other resources in the resource group.
> [!NOTE] > Since `pythonapp` continuously makes calls to `nodeapp` with messages that get persisted into your configured state store, it is important to complete these cleanup steps to avoid ongoing billable operations.
+> [!TIP]
+> Having issues? Let us know on GitHub by opening an issue in the [Azure Container Apps repo](https://github.com/microsoft/azure-container-apps).
+ ## Next steps > [!div class="nextstepaction"]
container-apps Microservices Dapr https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-apps/microservices-dapr.md
You learn how to: > [!div class="checklist"]+ > * Create a Container Apps environment for your container apps > * Create an Azure Blob Storage state store for the container app
-> * Deploy two apps that produce and consume messages and persist them using the state store
+> * Deploy two apps that produce and consume messages and persist them in the state store
> * Verify the interaction between the two microservices.
-Azure Container Apps offers a fully managed version of the Dapr APIs when building microservices. When you use Dapr in Azure Container Apps, you can enable sidecars to run next to your microservices that provide a rich set of capabilities. Available Dapr APIs include [Service to Service calls](https://docs.dapr.io/developing-applications/building-blocks/service-invocation/), [Pub/Sub](https://docs.dapr.io/developing-applications/building-blocks/pubsub/), [Event Bindings](https://docs.dapr.io/developing-applications/building-blocks/bindings/), [State Stores](https://docs.dapr.io/developing-applications/building-blocks/state-management/), and [Actors](https://docs.dapr.io/developing-applications/building-blocks/actors/).
-
-In this tutorial, you deploy the same applications from the Dapr [Hello World](https://github.com/dapr/quickstarts/tree/master/hello-kubernetes) quickstart, which consists of a client (Python) app that generates messages, and a service (Node) app that consumes and persists those messages in a configured state store. The following architecture diagram illustrates the components that make up this tutorial:
--
-## Prerequisites
-
-* [Azure CLI](/cli/azure/install-azure-cli)
-
-## Before you begin
-
-This guide makes use of the following environment variables:
-
-# [Bash](#tab/bash)
-
-```bash
-RESOURCE_GROUP="my-containerapps"
-LOCATION="canadacentral"
-CONTAINERAPPS_ENVIRONMENT="containerapps-env"
-LOG_ANALYTICS_WORKSPACE="containerapps-logs"
-STORAGE_ACCOUNT_CONTAINER="mycontainer"
-```
-
-# [PowerShell](#tab/powershell)
-
-```powershell
-$RESOURCE_GROUP="my-containerapps"
-$LOCATION="canadacentral"
-$CONTAINERAPPS_ENVIRONMENT="containerapps-env"
-$LOG_ANALYTICS_WORKSPACE="containerapps-logs"
-$STORAGE_ACCOUNT_CONTAINER="mycontainer"
-```
---
-The above snippet can be used to set the environment variables using bash, zsh, or PowerShell.
+With Azure Container Apps, you get a fully managed version of the Dapr APIs when building microservices. When you use Dapr in Azure Container Apps, you can enable sidecars to run next to your microservices that provide a rich set of capabilities. Available Dapr APIs include [Service to Service calls](https://docs.dapr.io/developing-applications/building-blocks/service-invocation/), [Pub/Sub](https://docs.dapr.io/developing-applications/building-blocks/pubsub/), [Event Bindings](https://docs.dapr.io/developing-applications/building-blocks/bindings/), [State Stores](https://docs.dapr.io/developing-applications/building-blocks/state-management/), and [Actors](https://docs.dapr.io/developing-applications/building-blocks/actors/).
-# [Bash](#tab/bash)
+In this tutorial, you deploy the same applications from the Dapr [Hello World](https://github.com/dapr/quickstarts/tree/master/hello-kubernetes) quickstart.
-```bash
-STORAGE_ACCOUNT="<storage account name>"
-```
+The application consists of:
-# [PowerShell](#tab/powershell)
-
-```powershell
-$STORAGE_ACCOUNT="<storage account name>"
-```
+* a client (Python) app that generates messages
+* a service (Node) app that consumes and persists those messages in a configured state store
--
-Choose a name for `STORAGE_ACCOUNT`. It will be created in a following step. Storage account names must be *unique within Azure* and between 3 and 24 characters in length and may contain numbers and lowercase letters only.
-
-## Setup
-
-Begin by signing in to Azure from the CLI.
-
-Run the following command, and follow the prompts to complete the authentication process.
-
-# [Bash](#tab/bash)
-
-```azurecli
-az login
-```
+The following architecture diagram illustrates the components that make up this tutorial:
-# [PowerShell](#tab/powershell)
-
-```azurecli
-az login
-```
---
-Ensure you're running the latest version of the CLI via the upgrade command.
-
-# [Bash](#tab/bash)
-
-```azurecli
-az upgrade
-```
-
-# [PowerShell](#tab/powershell)
-
-```azurecli
-az upgrade
-```
---
-Next, install the Azure Container Apps extension to the CLI.
-
-# [Bash](#tab/bash)
-
-```azurecli
-az extension add \
- --source https://workerappscliextension.blob.core.windows.net/azure-cli-extension/containerapp-0.2.2-py2.py3-none-any.whl
-```
-
-# [PowerShell](#tab/powershell)
-
-```azurecli
-az extension add `
- --source https://workerappscliextension.blob.core.windows.net/azure-cli-extension/containerapp-0.2.2-py2.py3-none-any.whl
-```
---
-Now that the extension is installed, register the `Microsoft.Web` namespace.
-
-# [Bash](#tab/bash)
-
-```azurecli
-az provider register --namespace Microsoft.Web
-```
-
-# [PowerShell](#tab/powershell)
-```azurecli
-az provider register --namespace Microsoft.Web
-```
-Create a resource group to organize the services related to your new container app.
+Individual container apps are deployed to an Azure Container Apps environment. To create the environment, run the following command:
# [Bash](#tab/bash) ```azurecli
-az group create \
- --name $RESOURCE_GROUP \
+az containerapp env create \
+ --name $CONTAINERAPPS_ENVIRONMENT \
+ --resource-group $RESOURCE_GROUP \
+ --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID \
+ --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET \
--location "$LOCATION" ``` # [PowerShell](#tab/powershell) ```azurecli
-az group create `
- --name $RESOURCE_GROUP `
+az containerapp env create `
+ --name $CONTAINERAPPS_ENVIRONMENT `
+ --resource-group $RESOURCE_GROUP `
+ --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID `
+ --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET `
--location "$LOCATION" ```
-With the CLI upgraded and a new resource group available, you can create a Container Apps environment and deploy your container app.
-
-## Create an environment
-
-Azure Container Apps environments act as isolation boundaries between a group of container apps. Container Apps deployed to the same environment are deployed in the same virtual network and write logs to the same Log Analytics workspace.
-
-Azure Log Analytics is used to monitor your container app and is required when creating a Container Apps environment.
-
-Create a new Log Analytics workspace with the following command:
-
-# [Bash](#tab/bash)
-
-```azurecli
-az monitor log-analytics workspace create \
- --resource-group $RESOURCE_GROUP \
- --workspace-name $LOG_ANALYTICS_WORKSPACE
-```
+## Set up a state store
-# [PowerShell](#tab/powershell)
+### Create an Azure Blob Storage account
-```azurecli
-az monitor log-analytics workspace create `
- --resource-group $RESOURCE_GROUP `
- --workspace-name $LOG_ANALYTICS_WORKSPACE
-```
-
+Choose a name for `STORAGE_ACCOUNT`. Storage account names must be *unique within Azure*, from 3 to 24 characters in length, and must contain only numbers and lowercase letters.
-Next, retrieve the Log Analytics Client ID and client secret.
# [Bash](#tab/bash)
-Make sure to run each query separately to give enough time for the request to complete.
- ```bash
-LOG_ANALYTICS_WORKSPACE_CLIENT_ID=`az monitor log-analytics workspace show --query customerId -g $RESOURCE_GROUP -n $LOG_ANALYTICS_WORKSPACE --out tsv`
-```
-
-```bash
-LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET=`az monitor log-analytics workspace get-shared-keys --query primarySharedKey -g $RESOURCE_GROUP -n $LOG_ANALYTICS_WORKSPACE --out tsv`
+STORAGE_ACCOUNT="<storage account name>"
``` # [PowerShell](#tab/powershell)
-Make sure to run each query separately to give enough time for the request to complete.
-
-```powershell
-$LOG_ANALYTICS_WORKSPACE_CLIENT_ID=(az monitor log-analytics workspace show --query customerId -g $RESOURCE_GROUP -n $LOG_ANALYTICS_WORKSPACE --out tsv)
-```
- ```powershell
-$LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET=(az monitor log-analytics workspace get-shared-keys --query primarySharedKey -g $RESOURCE_GROUP -n $LOG_ANALYTICS_WORKSPACE --out tsv)
+$STORAGE_ACCOUNT="<storage account name>"
```
-Individual container apps are deployed to an Azure Container Apps environment. To create the environment, run the following command:
+Set the `STORAGE_ACCOUNT_CONTAINER` name.
# [Bash](#tab/bash)
-```azurecli
-az containerapp env create \
- --name $CONTAINERAPPS_ENVIRONMENT \
- --resource-group $RESOURCE_GROUP \
- --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID \
- --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET \
- --location "$LOCATION"
+```bash
+STORAGE_ACCOUNT_CONTAINER="mycontainer"
``` # [PowerShell](#tab/powershell)
-```azurecli
-az containerapp env create `
- --name $CONTAINERAPPS_ENVIRONMENT `
- --resource-group $RESOURCE_GROUP `
- --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID `
- --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET `
- --location "$LOCATION"
+```powershell
+$STORAGE_ACCOUNT_CONTAINER="mycontainer"
```
-## Set up a state store
-
-### Create an Azure Blob Storage account
-
-Use the following command to create a new Azure Storage account.
+Use the following command to create an Azure Storage account.
# [Bash](#tab/bash)
az storage account create \
# [PowerShell](#tab/powershell)
-```azurecli
-az storage account create `
- --name $STORAGE_ACCOUNT `
- --resource-group $RESOURCE_GROUP `
- --location "$LOCATION" `
- --sku Standard_RAGRS `
- --kind StorageV2
+```powershell
+New-AzStorageAccount -ResourceGroupName $RESOURCE_GROUP `
+ -Name $STORAGE_ACCOUNT `
+ -Location $LOCATION `
+ -SkuName Standard_RAGRS
``` Once your Azure Blob Storage account is created, the following values are needed for subsequent steps in this tutorial.
-* `storage_account_name` is the value of the `STORAGE_ACCOUNT` variable you chose above.
+* `storage_account_name` is the value of the `STORAGE_ACCOUNT` variable that you set previously.
-* `storage_container_name` is the value of `STORAGE_ACCOUNT_CONTAINER` defined above (for example, `mycontainer`). Dapr creates a container with this name if it doesn't already exist in your Azure Storage account.
+* `storage_container_name` is the value of the `STORAGE_ACCOUNT_CONTAINER` variable. Dapr creates a container with this name when it doesn't already exist in your Azure Storage account.
-Get the storage account key with the following command.
+Get the storage account key with the following command:
# [Bash](#tab/bash)
echo $STORAGE_ACCOUNT_KEY
# [PowerShell](#tab/powershell) ```powershell
-$STORAGE_ACCOUNT_KEY=(az storage account keys list --resource-group $RESOURCE_GROUP --account-name $STORAGE_ACCOUNT --query '[0].value' --out tsv)
+$STORAGE_ACCOUNT_KEY=(Get-AzStorageAccountKey -ResourceGroupName $RESOURCE_GROUP -AccountName $STORAGE_ACCOUNT)| Where-Object -Property KeyName -Contains 'key1' | Select-Object -ExpandProperty Value
``` ```powershell
echo $STORAGE_ACCOUNT_KEY
### Configure the state store component
-Using the properties you sourced from the steps above, create a config file named *components.yaml*. This file helps enable your Dapr app to access your state store. The following example shows how your *components.yaml* file should look when configured for your Azure Blob Storage account:
+Create a config file named *components.yaml* with the properties that you sourced from the previous steps. This file enables your Dapr app to access your state store. The following example shows how your *components.yaml* file should look when configured for your Azure Blob Storage account:
```yaml # components.yaml for Azure Blob storage component
To use this file, make sure to replace the placeholder values between the `<>` b
## Deploy the service application (HTTP web server)
-Navigate to the directory in which you stored the *components.yaml* file and run the command below to deploy the service container app.
+Navigate to the directory in which you stored the *components.yaml* file and run the following command to deploy the service container app.
# [Bash](#tab/bash)
az containerapp create `
-This command deploys the service (Node) app server on `--target-port 3000` (the app's port) along with its accompanying Dapr sidecar configured with `--dapr-app-id nodeapp` and `--dapr-app-port 3000` for service discovery and invocation. Your state store is configured using `--dapr-components ./components.yaml`, which enables the sidecar to persist state.
+This command deploys:
+
+* the service (Node) app server on `--target-port 3000` (the app port)
+* its accompanying Dapr sidecar configured with `--dapr-app-id nodeapp` and `--dapr-app-port 3000` for service discovery and invocation
+
+Your state store is configured using `--dapr-components ./components.yaml`, which enables the sidecar to persist state.
## Deploy the client application (headless client)
-Run the command below to deploy the client container app.
+Run the following command to deploy the client container app.
# [Bash](#tab/bash)
This command deploys `pythonapp` that also runs with a Dapr sidecar that is used
### Confirm successful state persistence
-You can confirm the services are working correctly by viewing data in your Azure Storage account.
+You can confirm that the services are working correctly by viewing data in your Azure Storage account.
1. Open the [Azure portal](https://portal.azure.com) in your browser and navigate to your storage account.
-1. Select **Containers** on the left.
+1. Select **Containers** from the menu on the left side.
1. Select **mycontainer**. 1. Verify that you can see the file named `order` in the container.
-1. Click on the file.
+1. Select the file.
-1. Click the **Edit** tab.
+1. Select the **Edit** tab.
-1. Click the **Refresh** button to observe how the data automatically updates.
+1. Select the **Refresh** button to observe how the data automatically updates.
### View Logs
-Data logged via a container app are stored in the `ContainerAppConsoleLogs_CL` custom table in the Log Analytics workspace. You can view logs through the Azure portal or with the CLI. You may need to wait a few minutes for the analytics to arrive for the first time before you are able to query the logged data.
+Data logged from a container app is stored in the `ContainerAppConsoleLogs_CL` custom table in the Log Analytics workspace. You can view logs through the Azure portal or with the CLI. Wait a few minutes for the analytics to arrive for the first time before you are able to query the logged data.
Use the following CLI command to view logs on the command line.
az monitor log-analytics query \
# [PowerShell](#tab/powershell)
-```azurecli
-az monitor log-analytics query `
- --workspace $LOG_ANALYTICS_WORKSPACE_CLIENT_ID `
- --analytics-query "ContainerAppConsoleLogs_CL | where ContainerAppName_s == 'nodeapp' and (Log_s contains 'persisted' or Log_s contains 'order') | project ContainerAppName_s, Log_s, TimeGenerated | take 5" `
- --out table
+```powershell
+$queryResults = Invoke-AzOperationalInsightsQuery -WorkspaceId $LOG_ANALYTICS_WORKSPACE_CLIENT_ID -Query "ContainerAppConsoleLogs_CL | where ContainerAppName_s == 'nodeapp' and (Log_s contains 'persisted' or Log_s contains 'order') | project ContainerAppName_s, Log_s, TimeGenerated | take 5"
+$queryResults.Results
```
nodeapp Successfully persisted state. PrimaryResult 2021-10-22
nodeapp Got a new order! Order ID: 63 PrimaryResult 2021-10-22T22:45:44.618Z ```
-> [!TIP]
-> Having issues? Let us know on GitHub by opening an issue in the [Azure Container Apps repo](https://github.com/microsoft/azure-container-apps).
- ## Clean up resources
-Once you are done, clean up your Container App resources by running the following command to delete your resource group.
+Once you are done, run the following command to delete your resource group along with all the resources you created in this tutorial.
# [Bash](#tab/bash)
az group delete \
# [PowerShell](#tab/powershell)
-```azurecli
-az group delete `
- --resource-group $RESOURCE_GROUP
+```powershell
+Remove-AzResourceGroup -Name $RESOURCE_GROUP -Force
```
-This command deletes both container apps, the storage account, the container apps environment, and any other resources in the resource group.
-
-> [!NOTE]
+This command deletes the resource group that includes all of the resources created in this tutorial.
+> [!NOTE]
> Since `pythonapp` continuously makes calls to `nodeapp` with messages that get persisted into your configured state store, it is important to complete these cleanup steps to avoid ongoing billable operations.
+> [!TIP]
+> Having issues? Let us know on GitHub by opening an issue in the [Azure Container Apps repo](https://github.com/microsoft/azure-container-apps).
+ ## Next steps > [!div class="nextstepaction"]
container-registry Buffer Gate Public Content https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-registry/buffer-gate-public-content.md
Title: Manage public content in private container registry
description: Practices and workflows in Azure Container Registry to manage dependencies on public images from Docker Hub and other public content - Previously updated : 06/17/2021+ Last updated : 02/01/2022 # Manage public content with Azure Container Registry
As a recommended one-time step, [import](container-registry-import-images.md) ba
`az acr import` doesn't require a local Docker installation. You can run it with a local installation of the Azure CLI or directly in Azure Cloud Shell. It supports images of any OS type, multi-architecture images, or OCI artifacts such as Helm charts.
+Depending on your organization's needs, you can import to a dedicated registry or a repository in a shared registry.
+
+# [Azure CLI](#tab/azure-cli)
Example: ```azurecli-interactive
az acr import \
--password <Docker Hub token> ```
-Depending on your organization's needs, you can import to a dedicated registry or a repository in a shared registry.
+# [PowerShell](#tab/azure-powershell)
+Example:
+
+```azurepowershell-interactive
+Import-AzContainerRegistryImage `
+  -SourceImage library/busybox:latest `
+  -ResourceGroupName $resourceGroupName `
+  -RegistryName $RegistryName `
+  -SourceRegistryUri docker.io `
+  -TargetTag busybox:latest
+```
+ Credentials are required if the source registry is not available publicly or the admin user is disabled.
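For a source registry that requires authentication, the same cmdlet can pass credentials; a hedged sketch, where the `-Username` and `-Password` parameter names are assumptions to verify with `Get-Help Import-AzContainerRegistryImage`:

```azurepowershell-interactive
# The -Username/-Password parameter names below are assumptions; verify them locally.
Import-AzContainerRegistryImage `
  -SourceImage library/busybox:latest `
  -ResourceGroupName $resourceGroupName `
  -RegistryName $RegistryName `
  -SourceRegistryUri docker.io `
  -TargetTag busybox:latest `
  -Username <Docker Hub user name> `
  -Password <Docker Hub token>
```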
## Update image references
data-factory Author Global Parameters https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/author-global-parameters.md
Previously updated : 05/12/2021 Last updated : 01/31/2022
After a global parameter is created, you can edit it by clicking the parameter's
:::image type="content" source="media/author-global-parameters/create-global-parameter-3.png" alt-text="Create global parameters":::
+Global parameters are stored as part of the */factory/{factory_name}* ARM template *parameters.json* file.
+ ## Using global parameters in a pipeline Global parameters can be used in any [pipeline expression](control-flow-expression-language-functions.md). If a pipeline is referencing another resource such as a dataset or data flow, you can pass down the global parameter value via that resource's parameters. Global parameters are referenced as `pipeline().globalParameters.<parameterName>`.
$globalParametersJson = Get-Content $globalParametersFilePath
Write-Host "Parsing JSON..." $globalParametersObject = [Newtonsoft.Json.Linq.JObject]::Parse($globalParametersJson)
-foreach ($gp in $globalParametersObject.GetEnumerator()) {
+foreach ($gp in $factoryFileObject.properties.globalParameters.GetEnumerator()) {
+ # foreach ($gp in $globalParametersObject.GetEnumerator()) {
Write-Host "Adding global parameter:" $gp.Key $globalParameterValue = $gp.Value.ToObject([Microsoft.Azure.Management.DataFactory.Models.GlobalParameterSpecification]) $newGlobalParameters.Add($gp.Key, $globalParameterValue)
data-factory Data Factory Troubleshoot Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-factory-troubleshoot-guide.md
Previously updated : 09/30/2021 Last updated : 01/28/2022
For more information, see [Getting started with Fiddler](https://docs.telerik.co
## General
+### REST continuation token NULL error
+
+**Error message:** {\"token\":null,\"range\":{\"min\":\..}
+
+**Cause:** When querying across multiple partitions or pages, the backend service returns the continuation token in JObject format with three properties: **token, min and max key ranges**, for instance, {\"token\":null,\"range\":{\"min\":\"05C1E9AB0DAD76\",\"max":\"05C1E9CD673398"}}. Depending on the source data, a query page can return zero results and a missing token even though there is more data to fetch.
+
+**Recommendation:** When the continuationToken is non-null, as in the string {\"token\":null,\"range\":{\"min\":\"05C1E9AB0DAD76\",\"max":\"05C1E9CD673398"}}, call the queryActivityRuns API again with the continuation token from the previous response, passing the full string to the query API. The activities are returned in the subsequent pages of the query result. Even if a page contains an empty array, keep querying as long as the full continuationToken value is not null. For more details, please refer to [REST api for pipeline run query.](/rest/api/datafactory/activity-runs/query-by-pipeline-run)
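To make the paging loop concrete, here is a minimal sketch that calls the queryActivityRuns REST API until the token is exhausted. It assumes `$token` (a bearer token), `$subscriptionId`, `$resourceGroup`, `$factoryName`, and `$runId` are already set:

```powershell
$uri = "https://management.azure.com/subscriptions/$subscriptionId/resourceGroups/$resourceGroup" +
       "/providers/Microsoft.DataFactory/factories/$factoryName/pipelineruns/$runId" +
       "/queryActivityruns?api-version=2018-06-01"
$body = @{ lastUpdatedAfter = "2022-01-01T00:00:00Z"; lastUpdatedBefore = "2022-02-01T00:00:00Z" }
$activityRuns = @()
do {
    $response = Invoke-RestMethod -Method Post -Uri $uri `
        -Headers @{ Authorization = "Bearer $token" } `
        -ContentType "application/json" `
        -Body ($body | ConvertTo-Json)
    # A page can contain an empty array even when more data remains; page on the token alone.
    $activityRuns += $response.value
    # Pass the full continuation token string back on the next call.
    $body.continuationToken = $response.continuationToken
} while ($body.continuationToken)
```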
++ ### Activity stuck issue When you observe that the activity is running much longer than your normal runs with barely any progress, it might be stuck. You can try canceling it and retrying to see if that helps. If it's a copy activity, you can learn about performance monitoring and troubleshooting from [Troubleshoot copy activity performance](copy-activity-performance-troubleshooting.md); if it's a data flow, learn from [Mapping data flows performance](concepts-data-flow-performance.md) and the tuning guide.
For more troubleshooting help, try these resources:
* [Stack Overflow forum for Data Factory](https://stackoverflow.com/questions/tagged/azure-data-factory) * [Twitter information about Data Factory](https://twitter.com/hashtag/DataFactory) * [Azure videos](https://azure.microsoft.com/resources/videos/index/)
-* [Microsoft Q&A question page](/answers/topics/azure-data-factory.html)
+* [Microsoft Q&A question page](/answers/topics/azure-data-factory.html)
data-factory Data Flow Expression Functions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-expression-functions.md
___
<a name="fromBase64" ></a> ### <code>fromBase64</code>
-<code><b>fromBase64(<i>&lt;value1&gt;</i> : string) => string</b></code><br/><br/>
-Decodes the given base64-encoded string.
+<code><b>fromBase64(<i>&lt;value1&gt;</i> : string, [<i>&lt;encoding type&gt;</i> : string]) => string</b></code><br/><br/>
+Decodes the given base64-encoded string. You can optionally pass the encoding type.
* ``fromBase64('Z3VuY2h1cw==') -> 'gunchus'``
+* ``fromBase64('SGVsbG8gV29ybGQ=', 'Windows-1252') -> 'Hello World'``
___
___
<a name="toBase64" ></a> ### <code>toBase64</code>
-<code><b>toBase64(<i>&lt;value1&gt;</i> : string) => string</b></code><br/><br/>
-Encodes the given string in base64.
-* ``toBase64('bojjus') -> 'Ym9qanVz'``
-___
+<code><b>toBase64(<i>&lt;value1&gt;</i> : string, [<i>&lt;encoding type&gt;</i> : string]) => string</b></code><br/><br/>
+Encodes the given string in base64. You can optionally pass the encoding type.
+* ``toBase64('bojjus') -> 'Ym9qanVz'``
+* ``toBase64('± 25000, € 5.000,- |', 'Windows-1252') -> 'sSAyNTAwMCwggCA1LjAwMCwtIHw='``
+___
<a name="toBinary" ></a>
___
<a name="toString" ></a> ### <code>toString</code>
-<code><b>toString(<i>&lt;value&gt;</i> : any, [<i>&lt;number format/date format&gt;</i> : string]) => string</b></code><br/><br/>
-Converts a primitive datatype to a string. For numbers and date a format can be specified. If unspecified the system default is picked.Java decimal format is used for numbers. Refer to Java SimpleDateFormat for all possible date formats; the default format is yyyy-MM-dd.
+<code><b>toString(<i>&lt;value&gt;</i> : any, [<i>&lt;number format/date format&gt;</i> : string], [<i>&lt;date locale&gt;</i> : string]) => string</b></code><br/><br/>
+Converts a primitive datatype to a string. For numbers and dates, a format can be specified. If unspecified, the system default is picked. Java decimal format is used for numbers. Refer to Java SimpleDateFormat for all possible date formats; the default format is yyyy-MM-dd. For date or timestamp, a locale can be optionally specified.
* ``toString(10) -> '10'`` * ``toString('engineer') -> 'engineer'`` * ``toString(123456.789, '##,###.##') -> '123,456.79'``
Converts a primitive datatype to a string. For numbers and date a format can be
* ``toString(toDate('2018-12-31')) -> '2018-12-31'`` * ``isNull(toString(toDate('2018-12-31', 'MM/dd/yy'))) -> true`` * ``toString(4 == 20) -> 'false'``
-___
-
+* ``toString(toDate('12/31/18', 'MM/dd/yy', 'es-ES'), 'MM/dd/yy', 'de-DE')``
+___
<a name="toTimestamp" ></a>
data-factory Data Flow Troubleshoot Errors https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-troubleshoot-errors.md
Previously updated : 10/01/2021 Last updated : 01/21/2022 # Common error codes and messages
This article lists common error codes and messages reported by mapping data flow
If you're running the data flow in a debug test execution from a debug pipeline run, you might run into this condition more frequently. That's because Azure Data Factory throttles the broadcast timeout to 60 seconds to maintain a faster debugging experience. You can extend the timeout to the 300-second timeout of a triggered run. To do so, you can use the **Debug** > **Use Activity Runtime** option to use the Azure IR defined in your Execute Data Flow pipeline activity. -- **Message**: Broadcast join timeout error, you can choose 'Off' of broadcast option in join/exists/lookup transformation to avoid this issue. If you intend to broadcast join option to improve performance then make sure broadcast stream can produce data within 60 secs in debug runs and 300 secs in job runs.
+- **Message**: Broadcast join timeout error, you can choose 'Off' of broadcast option in join/exists/lookup transformation to avoid this issue. If you intend to broadcast join option to improve performance, then make sure broadcast stream can produce data within 60 secs in debug runs and 300 secs in job runs.
- **Cause**: Broadcast has a default timeout of 60 seconds in debug runs and 300 seconds in job runs. On the broadcast join, the stream chosen for broadcast is too large to produce data within this limit. If a broadcast join isn't used, the default broadcast by dataflow can reach the same limit. - **Recommendation**: Turn off the broadcast option or avoid broadcasting large data streams for which the processing can take more than 60 seconds. Choose a smaller stream to broadcast. Large Azure SQL Data Warehouse tables and source files aren't typically good choices. In the absence of a broadcast join, use a larger cluster if this error occurs.
This article lists common error codes and messages reported by mapping data flow
- **Recommendation**: Set an alias if you're using a SQL function like min() or max(). ## Error code: DF-Executor-DriverError-- **Message**: INT96 is legacy timestamp type which is not supported by ADF Dataflow. Please consider upgrading the column type to the latest types.
+- **Message**: INT96 is legacy timestamp type, which is not supported by ADF Dataflow. Please consider upgrading the column type to the latest types.
- **Cause**: Driver error. - **Recommendation**: INT96 is a legacy timestamp type that's not supported by Azure Data Factory data flow. Consider upgrading the column type to the latest type.
This article lists common error codes and messages reported by mapping data flow
- **Recommendation**: Contact the Microsoft product team for more details about this problem. ## Error code: DF-Executor-PartitionDirectoryError-- **Message**: The specified source path has either multiple partitioned directories (for e.g. &lt;Source Path&gt;/<Partition Root Directory 1>/a=10/b=20, &lt;Source Path&gt;/&lt;Partition Root Directory 2&gt;/c=10/d=30) or partitioned directory with other file or non-partitioned directory (for example &lt;Source Path&gt;/&lt;Partition Root Directory 1&gt;/a=10/b=20, &lt;Source Path&gt;/Directory 2/file1), remove partition root directory from source path and read it through separate source transformation.
+- **Message**: The specified source path has either multiple partitioned directories (for example, &lt;Source Path&gt;/<Partition Root Directory 1>/a=10/b=20, &lt;Source Path&gt;/&lt;Partition Root Directory 2&gt;/c=10/d=30) or partitioned directory with other file or non-partitioned directory (for example &lt;Source Path&gt;/&lt;Partition Root Directory 1&gt;/a=10/b=20, &lt;Source Path&gt;/Directory 2/file1), remove partition root directory from source path and read it through separate source transformation.
- **Cause**: The source path has either multiple partitioned directories or a partitioned directory that has another file or non-partitioned directory. - **Recommendation**: Remove the partitioned root directory from the source path and read it through separate source transformation.
This article lists common error codes and messages reported by mapping data flow
## Error code: InvalidTemplate - **Message**: The pipeline expression cannot be evaluated. - **Cause**: The pipeline expression passed in the Data Flow activity isn't being processed correctly because of a syntax error.-- **Recommendation**: Check your activity in activity monitoring to verify the expression.
+- **Recommendation**: Check the data flow activity name, and check the expressions in activity monitoring to verify them. For example, a data flow activity name can't contain a space or a hyphen.
## Error code: 2011 - **Message**: The activity was running on Azure Integration Runtime and failed to decrypt the credential of data store or compute connected via a Self-hosted Integration Runtime. Please check the configuration of linked services associated with this activity, and make sure to use the proper integration runtime type.
This article lists common error codes and messages reported by mapping data flow
## Error code: DF-Hive-InvalidBlobStagingConfiguration - **Message**: Blob storage staging properties should be specified. - **Cause**: An invalid staging configuration is provided in the Hive.-- **Recommendation**: Please check if the account key, account name and container are set properly in the related Blob linked service which is used as staging.
+- **Recommendation**: Please check if the account key, account name and container are set properly in the related Blob linked service, which is used as staging.
## Error code: DF-Hive-InvalidGen2StagingConfiguration - **Message**: ADLS Gen2 storage staging only support service principal key credential.
For more help with troubleshooting, see these resources:
- [Data Factory feature requests](/answers/topics/azure-data-factory.html) - [Azure videos](https://azure.microsoft.com/resources/videos/index/?sort=newest&services=data-factory) - [Stack Overflow forum for Data Factory](https://stackoverflow.com/questions/tagged/azure-data-factory)-- [Twitter information about Data Factory](https://twitter.com/hashtag/DataFactory)
+- [Twitter information about Data Factory](https://twitter.com/hashtag/DataFactory)
data-factory Solution Template Copy New Files Lastmodifieddate https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/solution-template-copy-new-files-lastmodifieddate.md
Previously updated : 3/8/2019 Last updated : 01/31/2022 # Copy new and changed files by LastModifiedDate with Azure Data Factory
The template defines six parameters:
1. Go to template **Copy new files only by LastModifiedDate**. Create a **New** connection to your source storage store. The source storage store is where you want to copy files from.
- :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate1.png" alt-text="Create a new connection to the source":::
+ :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate-1.png" alt-text="Create a new connection to the source":::
2. Create a **New** connection to your destination store. The destination store is where you want to copy files to.
- :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate3.png" alt-text="Create a new connection to the destination":::
+ :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate-3.png" alt-text="Create a new connection to the destination":::
3. Select **Use this template**.
- :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate4.png" alt-text="Use this template":::
+ :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate-4.png" alt-text="Use this template":::
4. You will see the pipeline available in the panel, as shown in the following example:
- :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate5.png" alt-text="Show the pipeline":::
+ :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate-5.png" alt-text="Show the pipeline":::
5. Select **Debug**, enter the values for the **Parameters**, and select **Finish**. In the picture below, we set the parameters as follows. - **FolderPath_Source** = sourcefolder
The template defines six parameters:
The example indicates that files last modified within the timespan (**2019-02-01T00:00:00Z** to **2019-03-01T00:00:00Z**) are copied from the source path **sourcefolder/subfolder** to the destination path **destinationfolder/subfolder**. You can replace these values with your own parameters.
- :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate6.png" alt-text="Run the pipeline":::
+ :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate-6.png" alt-text="Run the pipeline":::
6. Review the result. You will see only the files last modified within the configured timespan has been copied to the destination store.
- :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate7.png" alt-text="Review the result":::
+ :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate-7.png" alt-text="Review the result":::
7. Now you can add a tumbling window trigger to automate this pipeline, so that the pipeline periodically copies only new and changed files by LastModifiedDate. Select **Add trigger**, and select **New/Edit**.
- :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate8.png" alt-text="Screenshot that highlights the New/Edit menu option that appears when you select Add trigger.":::
+ :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate-8.png" alt-text="Screenshot that highlights the New/Edit menu option that appears when you select Add trigger.":::
8. In the **Add Triggers** window, select **+ New**.
9. Select **Tumbling Window** for the trigger type, set **Every 15 minute(s)** as the recurrence (you can change this to any interval), select **Yes** for the **Activated** box, and then select **OK**.
- :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate10.png" alt-text="Create trigger":::
+ :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate-10.png" alt-text="Create trigger":::
10. Set the values for the **Trigger Run Parameters** as follows, and select **Finish**.
- **FolderPath_Source** = **sourcefolder**. You can replace this with the folder in your source data store.
The template defines six parameters:
- **LastModified_From** = **\@trigger().outputs.windowStartTime**. This system variable from the trigger holds the start of the current tumbling window, that is, the time when the pipeline was last triggered.
- **LastModified_To** = **\@trigger().outputs.windowEndTime**. This system variable from the trigger holds the end of the current tumbling window, that is, the time when the pipeline is triggered this time.
- :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate11.png" alt-text="Input parameters":::
+ :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate-11.png" alt-text="Input parameters":::
11. Select **Publish All**.
- :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate12.png" alt-text="Publish All":::
+ :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate-12.png" alt-text="Publish All":::
12. Create new files in the source folder of your data store. The pipeline is then triggered automatically, and only the new files are copied to the destination store.
The template defines six parameters:
14. Review the result. The pipeline is triggered automatically every 15 minutes, and only new or changed files from the source store are copied to the destination store in each run.
- :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate15.png" alt-text="Screenshot that shows the results that return when the pipeline is triggered.":::
+ :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate-15.png" alt-text="Screenshot that shows the results that return when the pipeline is triggered.":::
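
If you prefer to script a test run instead of using **Debug**, the following minimal Azure PowerShell sketch passes the same values to the pipeline. The resource group, factory, and pipeline names are placeholder assumptions, and the parameter names follow this walkthrough; confirm them against the parameters your pipeline defines.

```powershell
# Hedged sketch: trigger one run of the template pipeline with an explicit
# LastModifiedDate window (all names are illustrative assumptions).
$params = @{
    FolderPath_Source      = 'sourcefolder/subfolder'
    FolderPath_Destination = 'destinationfolder/subfolder'
    LastModified_From      = '2019-02-01T00:00:00Z'
    LastModified_To        = '2019-03-01T00:00:00Z'
}
Invoke-AzDataFactoryV2Pipeline -ResourceGroupName 'myResourceGroup' `
    -DataFactoryName 'myDataFactory' `
    -PipelineName 'CopyNewFilesOnlyByLastModifiedDate' `
    -Parameter $params
```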
## Next steps
databox Data Box Customer Managed Encryption Key Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox/data-box-customer-managed-encryption-key-portal.md
To enable a customer-managed key for your existing Data Box order in the Azure p
![Customer-managed key URL](./media/data-box-customer-managed-encryption-key-portal/customer-managed-key-11.png) > [!IMPORTANT]
-> You must enable the `Get`, `UnwrapKey`, and `WrapKey` permissions on the key. To set the permissions in Azure CLI, see [az keyvault set-policy](/cli/azure/keyvault?view=azure-cli-latest#az_keyvault_set_policy).
+> You must enable the `Get`, `UnwrapKey`, and `WrapKey` permissions on the key. To set the permissions in Azure CLI, see [az keyvault set-policy](/cli/azure/keyvault#az_keyvault_set_policy).
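
If you manage the vault with Azure PowerShell instead of the CLI, a minimal equivalent is sketched below; the vault name and object ID are placeholders for your own values.

```powershell
# Hedged sketch: grant the Get, UnwrapKey, and WrapKey key permissions
# (placeholder vault name and managed identity object ID).
Set-AzKeyVaultAccessPolicy -VaultName 'myKeyVault' `
    -ObjectId '<managed-identity-object-id>' `
    -PermissionsToKeys get,unwrapKey,wrapKey
```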
## Change key
To change the key vault, key, and/or key version for the customer-managed key yo
![Save updated encryption settings - 1](./media/data-box-customer-managed-encryption-key-portal/customer-managed-key-17-a.png) > [!IMPORTANT]
-> You must enable the `Get`, `UnwrapKey`, and `WrapKey` permissions on the key. To set the permissions in Azure CLI, see [az keyvault set-policy](/cli/azure/keyvault?view=azure-cli-latest#az_keyvault_set_policy).
+> You must enable the `Get`, `UnwrapKey`, and `WrapKey` permissions on the key. To set the permissions in Azure CLI, see [az keyvault set-policy](/cli/azure/keyvault#az_keyvault_set_policy).
## Change identity
defender-for-cloud Supported Machines Endpoint Solutions Clouds https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-cloud/supported-machines-endpoint-solutions-clouds.md
Title: Microsoft Defender for Cloud's features according to OS, machine type, and cloud description: Learn about the availability of Microsoft Defender for Cloud features according to OS, machine type, and cloud deployment. Previously updated : 12/27/2021 Last updated : 02/01/2022
[!INCLUDE [Banner for top of topics](./includes/banner.md)]
-The two **tabs** below show the features of Microsoft Defender for Cloud that are available for Windows and Linux machines.
+The **tabs** below show the features of Microsoft Defender for Cloud that are available for Windows and Linux machines.
## Supported features for virtual machines and servers <a name="vm-server-features"></a> ### [**Windows machines**](#tab/features-windows)
-| **Feature** | **Azure Virtual Machines** | **Azure Virtual Machine Scale Sets** | **Azure Arc-enabled machines** | **Enhanced security features required** |
+| **Feature** | **Azure Virtual Machines** | **Azure Virtual Machine Scale Sets** | **Azure Arc-enabled machines** | **Defender for servers required** |
|--|::|::|::|::|
| [Microsoft Defender for Endpoint integration](integration-defender-for-endpoint.md) | ✔</br>(on supported versions) | ✔</br>(on supported versions) | ✔ | Yes |
| [Virtual machine behavioral analytics (and security alerts)](alerts-reference.md) | ✔ | ✔ | ✔ | Yes |
The two **tabs** below show the features of Microsoft Defender for Cloud that ar
### [**Linux machines**](#tab/features-linux)
-| **Feature** | **Azure Virtual Machines** | **Azure Virtual Machine Scale Sets** | **Azure Arc-enabled machines** | **Enhanced security features required** |
+| **Feature** | **Azure Virtual Machines** | **Azure Virtual Machine Scale Sets** | **Azure Arc-enabled machines** | **Defender for servers required** |
|--|::|::|::|::|
| [Microsoft Defender for Endpoint integration](integration-defender-for-endpoint.md) | ✔ | - | ✔ | Yes |
| [Virtual machine behavioral analytics (and security alerts)](./azure-defender.md) | ✔</br>(on supported versions) | ✔</br>(on supported versions) | ✔ | Yes |
devtest-labs Samples Powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/devtest-labs/samples-powershell.md
Title: Azure PowerShell Samples
-description: Azure PowerShell Samples - Scripts to help you manage labs in Azure Lab Services
+description: Learn about Azure PowerShell scripts. These samples help you manage labs in Azure Lab Services.
Previously updated : 06/26/2020 Last updated : 02/02/2022 # Azure PowerShell samples for Azure Lab Services
-The following table includes links to sample Azure PowerShell scripts for Azure Lab Services.
+This article includes sample Azure PowerShell scripts for Azure Lab Services.
+++
+This article includes the following samples:
| Script | Description |
+|---|---|
+| [Add an external user to a lab](#add-an-external-user-to-a-lab) | This PowerShell script adds an external user to a lab in Azure DevTest Labs. |
+| [Add marketplace images to a lab](#add-a-marketplace-image-to-a-lab) | This PowerShell script adds marketplace images to a lab in Azure DevTest Labs. |
+| [Create a custom image from a virtual hard drive (VHD)](#create-a-custom-image-from-a-vhd-file) | This PowerShell script creates a custom image from a VHD file in a lab in Azure DevTest Labs. |
+| [Create a custom role in a lab](#create-a-custom-role-in-a-lab) | This PowerShell script creates a custom role in a lab in Azure Lab Services. |
+| [Set allowed virtual machine sizes in a lab](#set-allowed-virtual-machine-sizes) | This PowerShell script sets allowed virtual machine sizes in a lab. |
+
+## Prerequisites
+
+All of these scripts have the following prerequisite:
+
+- An existing lab. If you don't have one, follow the quickstart [Create a lab in Azure portal](devtest-lab-create-lab.md).
+
+## Add an external user to a lab
+
+This sample PowerShell script adds an external user to a lab in Azure DevTest Labs.
++
+This script uses the following commands:
+
+| Command | Notes |
+|||
+| [Get-AzADUser](/powershell/module/az.resources/get-azaduser) | Retrieves the user object from Azure Active Directory. |
+| [New-AzRoleAssignment](/powershell/module/az.resources/new-azroleassignment) | Assigns the specified role to the specified principal, at the specified scope. |
+
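+The following minimal sketch shows the shape of the operation, assuming an existing lab named `MyLab` in resource group `MyLabRG`; the lab, group, and user principal names are placeholders.
+
+```powershell
+# Hedged sketch: grant an external (guest) user the DevTest Labs User role on a lab.
+$user = Get-AzADUser -UserPrincipalName 'user_outlook.com#EXT#@contoso.onmicrosoft.com'
+$lab  = Get-AzResource -ResourceType 'Microsoft.DevTestLab/labs' `
+    -ResourceGroupName 'MyLabRG' -Name 'MyLab'
+New-AzRoleAssignment -ObjectId $user.Id `
+    -RoleDefinitionName 'DevTest Labs User' -Scope $lab.ResourceId
+```
+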
+## Add a marketplace image to a lab
+
+This sample PowerShell script adds a marketplace image to a lab in Azure DevTest Labs.
++
+This script uses the following commands:
+
+| Command | Notes |
+|||
+| [Get-AzResource](/powershell/module/az.resources/get-azresource) | Gets resources. |
+| [Set-AzResource](/powershell/module/az.resources/set-azresource) | Modifies a resource. |
+| [New-AzResource](/powershell/module/az.resources/new-azresource) | Creates a resource. |
+
+## Create a custom image from a VHD file
+
+This sample PowerShell script creates a custom image from a VHD file in Azure Lab Services.
++
+This script uses the following commands:
+
+| Command | Notes |
+|||
+| [Get-AzResource](/powershell/module/az.resources/get-azresource) | Gets resources. |
+| [Get-AzStorageAccountKey](/powershell/module/az.storage/get-azstorageaccountkey) | Gets the access keys for an Azure Storage account. |
+| [New-AzResourceGroupDeployment](/powershell/module/az.resources/new-azresourcegroupdeployment) | Adds an Azure deployment to a resource group. |
+
+## Create a custom role in a lab
+
+This sample PowerShell script creates a custom role to use in a lab in Azure DevTest Labs.
++
+This script uses the following commands:
+
+| Command | Notes |
|||
-|[Add an external user to a lab](scripts/add-external-user-to-lab.md)| This PowerShell script adds an external user to a lab in Azure DevTest Labs. |
-|[Add marketplace images to a lab](scripts/add-marketplace-images-to-lab.md)| This PowerShell script adds marketplace images to a lab in Azure DevTest Labs. |
-|[Create a custom image from a VHD](scripts/create-custom-image-from-vhd.md)| This PowerShell script creates a custom image in a lab in Azure DevTest Labs. |
-|[Create a custom role in a lab](scripts/create-custom-role-in-lab.md)| This PowerShell script creates a custom role in a lab in Azure Lab Services. |
-|[Set allowed VM sizes in a lab](scripts/set-allowed-vm-sizes-in-lab.md)| This PowerShell script sets allowed virtual machine (VM) sizes in a lab. |
+| [Get-AzProviderOperation](/powershell/module/az.resources/get-azprovideroperation) | Gets the operations for an Azure resource provider that are securable using Azure role-based access control. |
+| [Get-AzRoleDefinition](/powershell/module/az.resources/get-azroledefinition) | Lists all Azure roles that are available for assignment. |
+| [New-AzRoleDefinition](/powershell/module/az.resources/new-azroledefinition) | Creates a custom role. |
+
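+The pattern is sketched below: clone a built-in role, trim it to the operations you want, and register it as a new definition. The role name, description, action, and scope are placeholder assumptions.
+
+```powershell
+# Hedged sketch: derive a custom role from the built-in DevTest Labs User role.
+$role = Get-AzRoleDefinition -Name 'DevTest Labs User'
+$role.Id = $null
+$role.Name = 'Lab VM Starter'
+$role.Description = 'Can only start virtual machines in a lab.'
+$role.Actions.Clear()
+$role.Actions.Add('Microsoft.DevTestLab/labs/virtualMachines/Start/action')
+$role.AssignableScopes.Clear()
+$role.AssignableScopes.Add('/subscriptions/<subscription-id>')
+New-AzRoleDefinition -Role $role
+```
+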
+## Set allowed virtual machine sizes
+
+This sample PowerShell script sets allowed virtual machine sizes in Azure Lab Services.
++
+| Command | Notes |
+|||
+| [Get-AzResource](/powershell/module/az.resources/get-azresource) | Gets resources. |
+| [Set-AzResource](/powershell/module/az.resources/set-azresource) | Modifies a resource. |
+| [New-AzResource](/powershell/module/az.resources/new-azresource) | Creates a resource. |
+
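+As a rough sketch, allowed sizes are stored as a lab policy resource. The resource names, API version, and property values below are assumptions that illustrate the shape of the call; check them against the linked sample before use.
+
+```powershell
+# Hedged sketch: set the AllowedVmSizesInLab policy on a lab (assumed names and values).
+Set-AzResource -ResourceGroupName 'MyLabRG' `
+    -ResourceType 'Microsoft.DevTestLab/labs/policySets/policies' `
+    -ResourceName 'MyLab/default/AllowedVmSizesInLab' `
+    -ApiVersion '2018-09-15' `
+    -Properties @{
+        status        = 'Enabled'
+        factName      = 'LabVmSize'
+        evaluatorType = 'AllowedValuesPolicy'
+        threshold     = '["Standard_DS1_v2","Standard_DS2_v2"]'
+    } `
+    -Force
+```
+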
+## Next steps
+
+For more information on Azure PowerShell, see [Azure PowerShell documentation](/powershell/).
devtest-labs Add External User To Lab https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/devtest-labs/scripts/add-external-user-to-lab.md
- Title: PowerShell - Add an external user to a lab
-description: This article provides an Azure PowerShell script that adds an external user to a lab in Azure DevTest Labs.
- Previously updated : 08/11/2020--
-# Use PowerShell to add an external user to a lab in Azure DevTest Labs
-
-This sample PowerShell script adds an external user to a lab in Azure DevTest Labs.
---
-## Prerequisites
-* **A lab**. The script requires you to have an existing lab.
-
-## Sample script
-
-[!code-powershell[main](../../../powershell_scripts/devtest-lab/add-external-user-to-lab/add-external-user-to-custom-lab.ps1 "Add external user to a lab")]
-
-## Script explanation
-
-This script uses the following commands:
-
-| Command | Notes |
-|||
-| [Get-AzADUser](/powershell/module/az.resources/get-azaduser) | Retries the user object from Azure active directory. |
-| [New-AzRoleAssignment](/powershell/module/az.resources/new-azroleassignment) | Assigns the specified role to the specified principal, at the specified scope. |
-
-## Next steps
-
-For more information on the Azure PowerShell, see [Azure PowerShell documentation](/powershell/).
-
-Additional Azure Lab Services PowerShell script samples can be found in the [Azure Lab Services PowerShell samples](../samples-powershell.md).
devtest-labs Add Marketplace Images To Lab https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/devtest-labs/scripts/add-marketplace-images-to-lab.md
- Title: PowerShell - Add a marketplace image to a lab
-description: This PowerShell script adds a marketplace image to a lab in Azure DevTest Labs.
- Previously updated : 08/11/2020--
-# Use PowerShell to add a marketplace image to a lab in Azure DevTest Labs
-
-This sample PowerShell script adds a marketplace image to a lab in Azure DevTest Labs.
---
-## Prerequisites
-* **A lab**. The script requires you to have an existing lab.
-
-## Sample script
-
-[!code-powershell[main](../../../powershell_scripts/devtest-lab/add-marketplace-images-to-lab/add-marketplace-images-to-lab.ps1 "Add marketplace images to a lab")]
-
-## Script explanation
-
-This script uses the following commands:
-
-| Command | Notes |
-|||
-| Find-AzResource | Searches for resources based on specified parameters. |
-| [Get-AzResource](/powershell/module/az.resources/get-azresource) | Gets resources. |
-| [Set-AzResource](/powershell/module/az.resources/set-azresource) | Modifies a resource. |
-| [New-AzResource](/powershell/module/az.resources/new-azresource) | Create a resource. |
-
-## Next steps
-
-For more information on the Azure PowerShell, see [Azure PowerShell documentation](/powershell/).
-
-Additional Azure Lab Services PowerShell script samples can be found in the [Azure Lab Services PowerShell samples](../samples-powershell.md).
devtest-labs Create Custom Image From Vhd https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/devtest-labs/scripts/create-custom-image-from-vhd.md
- Title: PowerShell - Create custom image from VHD file
-description: This PowerShell script creates a custom image from a VHD file in Azure Lab Services.
- Previously updated : 08/11/2020--
-# Use PowerShell to create a custom image from a VHD file in Azure Lab Services
-
-This sample PowerShell script creates a custom image from a VHD file in Azure Lab Services
---
-## Prerequisites
-* **A lab**. The script requires you to have an existing lab.
-
-## Sample script
-
-[!code-powershell[main](../../../powershell_scripts/devtest-lab/create-custom-image-from-vhd/create-custom-image-from-vhd.ps1 "Add external user to a lab")]
-
-## Script explanation
-
-This script uses the following commands:
-
-| Command | Notes |
-|||
-| [Get-AzResource](/powershell/module/az.resources/get-azresource) | Gets resources. |
-| [Get-AzStorageAccountKey](/powershell/module/az.storage/get-azstorageaccountkey) | Gets the access keys for an Azure Storage account. |
-| [New-AzResourceGroupDeployment](/powershell/module/az.resources/new-azresourcegroupdeployment) | Adds an Azure deployment to a resource group. |
-
-## Next steps
-
-For more information on the Azure PowerShell, see [Azure PowerShell documentation](/powershell/).
-
-Additional Azure Lab Services PowerShell script samples can be found in the [Azure Lab Services PowerShell samples](../samples-powershell.md).
devtest-labs Create Custom Role In Lab https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/devtest-labs/scripts/create-custom-role-in-lab.md
- Title: PowerShell - Create a custom role in a lab
-description: This article provides an Azure PowerShell script that creats a custom role in a lab in Azure DevTest Labs.
- Previously updated : 08/11/2020--
-# Use PowerShell to create a custom role in a lab in Azure DevTest Labs
-
-This sample PowerShell script creates a custom role to use in a lab in Azure DevTest Labs.
---
-## Prerequisites
-* **A lab**. The script requires you to have an existing lab.
-
-## Sample script
-
-[!code-powershell[main](../../../powershell_scripts/devtest-lab/create-custom-role-in-lab/create-custom-role-in-lab.ps1 "Add external user to a lab")]
-
-## Script explanation
-
-This script uses the following commands:
-
-| Command | Notes |
-|||
-| [Get-AzProviderOperation](/powershell/module/az.resources/get-azprovideroperation) | Gets the operations for an Azure resource provider that are securable using Azure RBAC. |
-| [Get-AzRoleDefinition](/powershell/module/az.resources/get-azroledefinition) | Lists all Azure roles that are available for assignment. |
-| [New-AzRoleDefinition](/powershell/module/az.resources/new-azroledefinition) | Creates a custom role. |
-
-## Next steps
-
-For more information on the Azure PowerShell, see [Azure PowerShell documentation](/powershell/).
-
-Additional Azure Lab Services PowerShell script samples can be found in the [Azure Lab Services PowerShell samples](../samples-powershell.md).
devtest-labs Set Allowed Vm Sizes In Lab https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/devtest-labs/scripts/set-allowed-vm-sizes-in-lab.md
- Title: "PowerShell script: Set allowed VM sizes"
-description: This article includes a sample PowerShell script that sets allowed virtual machine (VM) sizes in Azure Lab Services.
- Previously updated : 08/11/2020--
-# Use PowerShell to set allowed VM sizes in Azure Lab Services
-
-This sample PowerShell script sets allowed virtual machine (VM) sizes in Azure Lab Services.
---
-## Prerequisites
-* **A lab**. The script requires you to have an existing lab.
-
-## Sample script
-
-[!code-powershell[main](../../../powershell_scripts/devtest-lab/set-allowed-vm-sizes-in-lab/set-allowed-vm-sizes-in-lab.ps1 "Add external user to a lab")]
-
-## Script explanation
-
-This script uses the following commands:
-
-| Command | Notes |
-|||
-| Find-AzResource | Searches for resources based on specified parameters. |
-| [Get-AzResource](/powershell/module/az.resources/get-azresource) | Gets resources. |
-| [Set-AzResource](/powershell/module/az.resources/set-azresource) | Modifies a resource. |
-| [New-AzResource](/powershell/module/az.resources/new-azresource) | Create a resource. |
-
-## Next steps
-
-For more information on the Azure PowerShell, see [Azure PowerShell documentation](/powershell/).
-
-Additional Azure Lab Services PowerShell script samples can be found in the [Azure Lab Services PowerShell samples](../samples-powershell.md).
event-hubs Event Hubs For Kafka Ecosystem Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/event-hubs/event-hubs-for-kafka-ecosystem-overview.md
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule require
> [!NOTE] > When using SAS authentication with Kafka clients, established connections aren't disconnected when the SAS key is regenerated.
+> [!NOTE]
+> [Generated shared access signature tokens](authenticate-shared-access-signature.md#generate-a-shared-access-signature-token) are not supported when using the Event Hubs for Apache Kafka endpoint.
#### Samples For a **tutorial** with step-by-step instructions to create an event hub and access it using SAS or OAuth, see [Quickstart: Data streaming with Event Hubs using the Kafka protocol](event-hubs-quickstart-kafka-enabled-event-hubs.md).
event-hubs Monitor Event Hubs Reference https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/event-hubs/monitor-event-hubs-reference.md
Azure Event Hubs supports the following dimensions for metrics in Azure Monitor.
[!INCLUDE [event-hubs-diagnostic-log-schema](./includes/event-hubs-diagnostic-log-schema.md)]
-## Runtime audit Logs
+## Runtime audit logs
Runtime audit logs capture aggregated diagnostic information for all data plane access operations (such as send or receive events) in the Event Hubs dedicated cluster. > [!NOTE]
Name | Description
`Protocol` | Type of the protocol associated with the operation.
`AuthType` | Type of authentication (Azure Active Directory or SAS Policy).
`AuthKey` | Azure Active Directory application ID or SAS policy name that's used to authenticate to a resource.
-`NetworkType` | Type of the network access: `PublicNetworkAccess`, `PrivateNetworkAccess`.
+`NetworkType` | Type of the network access: `Public` or `Private`.
`ClientIP` | IP address of the client application.
`Count` | Total number of operations performed during the aggregated period of one minute.
`Properties` | Metadata that's specific to the data plane operation.
Here's an example of a runtime audit log entry:
```json
{
  "ActivityId": "<activity id>",
- "ActivityName": "ConnectionOpen | Authenticate | SendMessage | ReceiveMessage | GetRuntimeInfo",
+ "ActivityName": "ConnectionOpen | Authorization | SendMessage | ReceiveMessage",
"ResourceId": "/SUBSCRIPTIONS/xxx/RESOURCEGROUPS/<Resource Group Name>/PROVIDERS/MICROSOFT.EVENTHUB/NAMESPACES/<Event Hubs namespace>/eventhubs/<event hub name>", "Time": "1/1/2021 8:40:06 PM +00:00", "Status": "Success | Failure", "Protocol": "AMQP | KAFKA | HTTP | Web Sockets", "AuthType": "SAS | Azure Active Directory",
- "AuthId": "<app name | SAS policy name>",
- "NetworkType": "PublicNetworkAccess | PrivateNetworkAccess",
+ "AuthId": "<AAD application name | SAS policy name>",
+ "NetworkType": "Public | Private",
"ClientIp": "x.x.x.x", "Count": 1,
- "Properties": {
- "key1": "value1",
- "key2": "value2"
- },
"Category": "RuntimeAuditLogs" }
expressroute About Fastpath https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/expressroute/about-fastpath.md
ExpressRoute virtual network gateway is designed to exchange network routes and
### Circuits
-FastPath is available on all ExpressRoute circuits.
+FastPath is available on all ExpressRoute circuits. Public preview support for Private Link connectivity over FastPath is available for connections associated with ExpressRoute Direct circuits. Connections associated with ExpressRoute partner circuits aren't eligible for the preview.
### Gateways
This preview supports connectivity to the following Azure
- Azure Storage - Third Party Private Link Services
+This preview is available for connections associated with ExpressRoute Direct circuits. Connections associated with ExpressRoute partner circuits aren't eligible for this preview.
+ > [!NOTE] > Private Link pricing will not apply to traffic sent over ExpressRoute FastPath during Public preview. For more information about pricing, check out the [Private Link pricing page](https://azure.microsoft.com/pricing/details/private-link/). >
governance Guest Configuration Create Group Policy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/policy/how-to/guest-configuration-create-group-policy.md
- Title: How-to create a guest configuration policy from Group Policy
-description: Learn how to convert Group Policy into a policy definition.
Previously updated : 03/31/2021--
-# How-to create a guest configuration policy from Group Policy
-
-Before you begin, it's a good idea to read the overview page for
-[guest configuration](../concepts/guest-configuration.md),
-and the details about guest configuration policy effects
-[How to configure remediation options for guest configuration](../concepts/guest-configuration-policy-effects.md).
-
-> [!IMPORTANT]
-> Converting Group Policy to guest configuration is **in preview**. Not all types
-> of Group Policy settings have corresponding DSC resources available for
-> PowerShell 7.
->
-> All of the commands on this page must be run in **Windows PowerShell 5.1**.
-> The resulting output MOF files should then be packaged using the
-> `GuestConfiguration` module in PowerShell 7.1.3 or later.
->
-> Custom guest configuration policy definitions using **AuditIfNotExists** are
-> Generally Available, but definitions using **DeployIfNotExists** with guest
-> configuration are **in preview**.
->
-> The guest configuration extension is required for Azure virtual machines. To
-> deploy the extension at scale across all machines, assign the following policy
-> initiative: `Deploy prerequisites to enable guest configuration policies on
-> virtual machines`
->
-> Don't use secrets or confidential information in custom content packages.
-
-The open source community has published the module
-[BaselineManagement](https://github.com/microsoft/BaselineManagement)
-to convert exported
-[Group Policy](/support/windows-server/group-policy/group-policy-overview)
-templates to PowerShell DSC format. Together with the `GuestConfiguration`
-module, you can create a guest configuration package for Windows
-from exported Group Policy Objects. The guest configuration package can then
-be used to audit or configure servers using local policy, even if they aren't
-domain joined.
-
-In this guide, we walk through the process to create an Azure Policy guest
-configuration package from a Group Policy Object (GPO).
-
-## Download required PowerShell modules
-
-To install all required modules in PowerShell:
-
-```powershell
-Install-Module guestconfiguration
-Install-Module baselinemanagement
-```
-
-To backup Group Policy Objects (GPOs) from an Active Directory environment,
-you need the PowerShell commands available in the Remote Server Administration
-Toolkit (RSAT).
-
-To enable RSAT for Group Policy Management Console on Windows 10:
-
-```powerShell
-Add-WindowsCapability -Online -Name 'Rsat.GroupPolicy.Management.Tools~~~~0.0.1.0'
-Add-WindowsCapability -Online -Name 'Rsat.ActiveDirectory.DS-LDS.Tools~~~~0.0.1.0'
-```
-
-## Export and convert Group Policy to guest configuration
-
-There are three options to export Group Policy files and convert them to DSC to
-use in guest configuration.
--- Export a single Group Policy Object-- Export the merged Group Policy Objects for an OU-- Export the merged Group Policy Objects from within a machine-
-### Single Group Policy Object
-
-Identify the GUID of the Group Policy Object to export by using the commands in
-the `Group Policy` module. In a large environment, consider piping the output
-to `where-object` and filtering by name.
-
-Run each of the following in a **Windows PowerShell 5.1** environment on a
-**domain joined** Windows machine:
-
-```powershell
-# List all Group Policy Objects
-Get-GPO -all
-```
-
-Backup the Group Policy to files. The command also accepts a "Name" parameter,
-but using the GUID of the policy is less error prone.
-
-```powershell
-Backup-GPO -Guid 'f0cf623e-ae29-4768-9bb4-406cce1f3cff' -Path C:\gpobackup\
-```
-
-```
-
-The output of the command returns the details of the files.
-
-ConfigurationScript Configuration Name
-- - -
-C:\convertfromgpo\myCustomPolicy1.ps1 C:\convertfromgpo\localhost.mof myCustomPolicy1
-```
-
-Review the exported PowerShell script to make sure all settings have been
-populated and no error messages were written. Create a new configuration package
-using the MOF file by following the guidance in page
-[How to create custom guest configuration package artifacts](./guest-configuration-create.md).
-The steps to create and test the guest configuration package should be run in
-a PowerShell 7 environment.
-
-### Merged Group Policy Objects for an OU
-
-Export the merged combination of Group Policy Objects (similar to a resultant
-set of policy) at a specified Organizational Unit. The merge operation takes in
-to account link state, enforcement, and access, but not WMI filters.
-
-```powershell
-Merge-GPOsFromOU -Path C:\mergedfromou\ -OUDistinguishedName 'OU=mySubOU,OU=myOU,DC=mydomain,DC=local' -OutputConfigurationScript
-```
-
-The output of the command returns the details of the files.
-
-```powershell
-Configuration Name ConfigurationScript
-- - -
-C:\mergedfromou\mySubOU\output\localhost.mof mySubOU C:\mergedfromou\mySubOU\output\mySubOU.ps1
-```
-
-### Merged Group Policy Objects from within a machine
-
-You can also merge the policies applied to a specific machine, by running the
-`Merge-GPOs` command from Windows PowerShell. WMI Filters are only evaluated
-if you merge from within a machine.
-
-```powershell
-Merge-GPOs -OutputConfigurationScript -Path c:\mergedgpo
-```
-
-The output of the command will return the details of the files.
-
-```powershell
-Configuration Name ConfigurationScript PolicyDetails
-- - - -
-C:\mergedgpo\localhost.mof MergedGroupPolicy_ws1 C:\mergedgpo\MergedGroupPolicy_ws1.ps1 {@{Name=myEnforcedPolicy; Ap...
-```
-
-## OPTIONAL: Download sample Group Policy files for testing
-
-If you aren't ready to export Group Policy files from an Active Directory environment, you can
-download Windows Server security baseline from the Windows Security and Compliant Toolkit.
-
-Create a directory for and download the Windows Server 2019 Security Baseline from the Windows
-Security Compliance toolkit.
-
-```azurepowershell-interactive
-# Download the 2019 Baseline files from https://docs.microsoft.com/windows/security/threat-protection/security-compliance-toolkit-10
-New-Item -Path 'C:\git\policyfiles\downloads' -Type Directory
-Invoke-WebRequest -Uri 'https://download.microsoft.com/download/8/5/C/85C25433-A1B0-4FFA-9429-7E023E7DA8D8/Windows%2010%20Version%201909%20and%20Windows%20Server%20Version%201909%20Security%20Baseline.zip' -Out C:\git\policyfiles\downloads\Server2019Baseline.zip
-```
-
-Unblock and expand the downloaded Server 2019 Baseline.
-
-```azurepowershell-interactive
-Unblock-File C:\git\policyfiles\downloads\Server2019Baseline.zip
-Expand-Archive -Path C:\git\policyfiles\downloads\Server2019Baseline.zip -DestinationPath C:\git\policyfiles\downloads\
-```
-
-Validate the Server 2019 Baseline contents using **MapGuidsToGpoNames.ps1**.
-
-```azurepowershell-interactive
-# Show content details of downloaded GPOs
-C:\git\policyfiles\downloads\Scripts\Tools\MapGuidsToGpoNames.ps1 -rootdir C:\git\policyfiles\downloads\GPOs\ -Verbose
-```
-
-## Next steps
--- [Create a package artifact](./guest-configuration-create.md)
- for guest configuration.
-- [Test the package artifact](./guest-configuration-create-test.md)
- from your development environment.
-- [Publish the package artifact](./guest-configuration-create-publish.md)
- so it is accessible to your machines.
-- Use the `GuestConfiguration` module to
- [create an Azure Policy definition](./guest-configuration-create-definition.md)
- for at-scale management of your environment.
-- [Assign your custom policy definition](../assign-policy-portal.md) using
- Azure portal.
-- Learn how to view
- [compliance details for guest configuration](./determine-non-compliance.md#compliance-details-for-guest-configuration) policy assignments.
healthcare-apis Deploy Healthcare Apis Using Bicep https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/healthcare-apis/deploy-healthcare-apis-using-bicep.md
+
+ Title: How to create Healthcare APIs, workspaces, FHIR and DICOM service, and IoT connectors using Azure Bicep
+description: This document describes how to deploy Healthcare APIs using Azure Bicep.
++++ Last updated : 01/31/2022++++
+# Deploy Healthcare APIs Using Azure Bicep
+
+In this article, you'll learn how to create Healthcare APIs, including workspaces, FHIR services, DICOM services, and IoT connectors using Azure Bicep. You can view and download the Bicep scripts used in this article in [HealthcareAPIs samples](https://github.com/microsoft/healthcare-apis-samples/blob/main/src/templates/healthcareapis.bicep).
+
+## What is Azure Bicep
+
+Bicep is built on top of Azure Resource Manager (ARM) templates. Bicep immediately supports all preview and generally available (GA) versions of Azure services, including Healthcare APIs. During development, you can generate a JSON ARM template file using the `az bicep build` command. Conversely, you can decompile JSON files to Bicep using the `az bicep decompile` command. During deployment, the Bicep CLI converts a Bicep file into ARM template JSON.
+
+You can continue to work with JSON ARM templates, or use Bicep to develop your ARM templates. For more information on Bicep, see [What is Bicep](../azure-resource-manager/bicep/overview.md).
+
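+As a quick sketch, the round trip between the two formats looks like this (file names are placeholder assumptions):
+
+```powershell
+# Hedged sketch: convert between Bicep and ARM JSON with the Azure CLI.
+az bicep build --file .\main.bicep        # emits main.json (ARM template JSON)
+az bicep decompile --file .\main.json     # emits main.bicep
+```
+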
+>[!Note]
+>The templates and scripts in this article were tested in Visual Studio Code during the public preview. Some changes may be necessary to adapt the code to run in your environment.
+
+## Define parameters and variables
+
+Using Bicep parameters and variables instead of hard coding names and other values allows you to debug and reuse your Bicep templates.
+
+We first define parameters with the keyword *param* for the workspace, FHIR service, DICOM service, and IoT connector. We also define parameters for the Azure subscription and Azure Active Directory (Azure AD) tenant. They're used in the CLI command line with the "--parameters" option.
+
+We then define variables for resources with the keyword *var*. We also define variables for properties, such as the authority and the audience for the FHIR service. They're specified and used internally in the Bicep template, and can be used in combination with parameters, Bicep functions, and other variables. Unlike parameters, they aren't used in the CLI command line.
+
+Note that the Bicep `environment()` function is required to obtain the sign-in URL, `https://login.microsoftonline.com`. For more information on Bicep functions, see [Deployment functions for Bicep](../azure-resource-manager/bicep/bicep-functions-deployment.md#environment).
+
+```
+param workspaceName string
+param fhirName string
+param dicomName string
+param iotName string
+param tenantId string
+
+var fhirservicename = '${workspaceName}/${fhirName}'
+var dicomservicename = '${workspaceName}/${dicomName}'
+var iotconnectorname = '${workspaceName}/${iotName}'
+var iotdestinationname = '${iotconnectorname}/output1'
+var loginURL = environment().authentication.loginEndpoint
+var authority = '${loginURL}${tenantId}'
+var audience = 'https://${workspaceName}-${fhirName}.fhir.azurehealthcareapis.com'
+```
+
+## Create a workspace template
+
+To define a resource, use the keyword *resource*. For the workspace resource, the required properties include the workspace name and location. In the template, the location of the resource group is used, but you can specify a different value for the location. For the resource name, you can reference the defined parameter or variable.
+
+For more information on resources and modules, see [Resource declaration in Bicep](../azure-resource-manager/bicep/resource-declaration.md).
+
+```
+//Create a workspace
+resource exampleWorkspace 'Microsoft.HealthcareApis/workspaces@2021-06-01-preview' = {
+ name: workspaceName
+ location: resourceGroup().location
+}
+```
+
+To use or reference an existing workspace without creating one, use the keyword *existing*. Specify the workspace resource name, and the existing workspace instance name for the name property. Note that a different name for the existing workspace resource is used in the template, but that is not a requirement.
+
+```
+//Use an existing workspace
+resource exampleExistingWorkspace 'Microsoft.HealthcareApis/workspaces@2021-06-01-preview' existing = {
+ name: workspaceName
+}
+```
+
+You're now ready to deploy the workspace resource using the `az deployment group create` command. You can also deploy it along with its other resources, as described later in this article.
+
+## Create a FHIR service template
+
+For the FHIR service resource, the required properties include service instance name, location, kind, and managed identity. Also, it has a dependency on the workspace resource. For the FHIR service itself, the required properties include authority and audience, which are specified in the properties element.
+
+```
+resource exampleFHIR 'Microsoft.HealthcareApis/workspaces/fhirservices@2021-06-01-preview' = {
+ name: fhirservicename
+ location: resourceGroup().location
+ kind: 'fhir-R4'
+ identity: {
+ type: 'SystemAssigned'
+ }
+ dependsOn: [
+ exampleWorkspace
+ //exampleExistingWorkspace
+ ]
+ properties: {
+ accessPolicies: []
+ authenticationConfiguration: {
+ authority: authority
+ audience: audience
+ smartProxyEnabled: false
+ }
+ }
+}
+```
+
+Similarly, you can use or reference an existing FHIR service using the keyword *existing*.
+
+```
+//Use an existing FHIR service
+resource exampleExistingFHIR 'Microsoft.HealthcareApis/workspaces/fhirservices@2021-06-01-preview' existing = {
+ name: fhirservicename
+}
+```
+
+## Create a DICOM service template
+
+For the DICOM service resource, the required properties include service instance name and location, and the dependency on the workspace resource type.
+
+```
+//Create DICOM service
+resource exampleDICOM 'Microsoft.HealthcareApis/workspaces/dicomservices@2021-06-01-preview' = {
+ name: dicomservicename
+ location: resourceGroup().location
+ dependsOn: [
+ exampleWorkspace
+ ]
+ properties: {}
+}
+```
+
+Similarly, you can use or reference an existing DICOM service using the keyword *existing*.
+
+```
+//Use an existing DICOM service
+ resource exampleExistingDICOM 'Microsoft.HealthcareApis/workspaces/dicomservices@2021-06-01-preview' existing = {
+ name: dicomservicename
+}
+```
+
+## Create an IoT connector template
+
+For the IoT connector resource, the required properties include the IoT connector name, location, managed identity, and the dependency on the workspace. For the IoT connector itself, the required properties include the Azure Event Hubs namespace, event hub, consumer group, and device mapping. As an example, the heart rate device mapping is used in the template.
+
+```
+//Create IoT connector
+resource exampleIoT 'Microsoft.HealthcareApis/workspaces/iotconnectors@2021-06-01-preview' = {
+ name: iotconnectorname
+ location: resourceGroup().location
+ identity: {
+ type: 'SystemAssigned'
+ }
+ dependsOn: [
+ exampleWorkspace
+ //exampleExistingWorkspace
+ ]
+ properties: {
+ ingestionEndpointConfiguration: {
+ eventHubName: 'eventhubnamexxx'
+ consumerGroup: 'eventhubconsumergroupxxx'
+ fullyQualifiedEventHubNamespace: 'eventhubnamespacexxx.servicebus.windows.net'
+ }
+ deviceMapping: {
+ content: {
+ templateType: 'CollectionContent'
+ template: [
+ {
+ templateType: 'JsonPathContent'
+ template: {
+ typeName: 'heartrate'
+ typeMatchExpression: '$..[?(@heartrate)]'
+ deviceIdExpression: '$.deviceid'
+ timestampExpression: '$.measurementdatetime'
+ values: [
+ {
+ required: 'true'
+ valueExpression: '$.heartrate'
+ valueName: 'Heart rate'
+ }
+ ]
+ }
+ }
+ ]
+ }
+ }
+ }
+ }
+```
+
+Similarly, you can use or reference an existing IoT connector using the keyword *existing*.
+
+```
+//Use an existing IoT
+resource exampleExistingIoT 'Microsoft.HealthcareApis/workspaces/iotconnectors@2021-06-01-preview' existing = {
+ name: iotconnectorname
+}
+```
+
+The IoT connector requires a destination child resource, and it currently supports only the FHIR service destination. For the IoT connector destination resource, the required properties include a name, location, and the dependency on the IoT connector. For the FHIR service destination, the required properties include the resolution type, which takes a value of *Create* or *Lookup*, the FHIR service resource ID, and a FHIR resource type. As an example, the heart rate mapping for the FHIR Observation resource is used in the template.
+
+```
+//Create IoT destination
+resource exampleIoTDestination 'Microsoft.HealthcareApis/workspaces/iotconnectors/fhirdestinations@2021-06-01-preview' = {
+ name: iotdestinationname
+ location: resourceGroup().location
+ dependsOn: [
+ exampleIoT
+ //exampleExistingIoT
+ ]
+ properties: {
+ resourceIdentityResolutionType: 'Create'
+ fhirServiceResourceId: exampleFHIR.id //exampleExistingFHIR.id
+ fhirMapping: {
+ content: {
+ templateType: 'CollectionFhirTemplate'
+ template: [
+ {
+ templateType: 'CodeValueFhir'
+ template: {
+ codes: [
+ {
+ code: '8867-4'
+ system: 'http://loinc.org'
+ display: 'Heart rate'
+ }
+ ]
+ periodInterval: 60
+ typeName: 'heartrate'
+ value: {
+ defaultPeriod: 5000
+ unit: 'count/min'
+ valueName: 'hr'
+ valueType: 'SampledData'
+ }
+ }
+ }
+ ]
+ }
+ }
+ }
+}
+```
+
+## Deploy Healthcare APIs
+
+You can use the `az deployment group create` command to deploy an individual Bicep template or combined templates, similar to the way you deploy Azure resources with JSON templates. Specify the resource group name, and include the parameters in the command line. With the "--parameters" option, specify each parameter and value pair as "parameter=value", and separate the pairs with a space when more than one parameter is defined.
+
+For the Azure subscription and tenant, you can specify the values, or use CLI commands to obtain them from the current sign-in session.
+
+```
+resourcegroupname=xxx
+location=eastus2 # for example, eastus2
+workspacename=xxx
+fhirname=xxx
+dicomname=xxx
+iotname=xxx
+bicepfilename=xxx.bicep
+#tenantid=xxx
+#subscriptionid=xxx
+subscriptionid=$(az account show --query id --output tsv)
+tenantid=$(az account show --subscription $subscriptionid --query tenantId --output tsv)
+
+az deployment group create --resource-group $resourcegroupname --template-file $bicepfilename --parameters workspaceName=$workspacename fhirName=$fhirname dicomName=$dicomname iotName=$iotname tenantId=$tenantid
+```
+
+Note that a child resource name, such as the FHIR service name, includes the parent resource name, and the "dependsOn" property is required. However, when a child resource is created within its parent resource, its name doesn't need to include the parent resource name, and the "dependsOn" property isn't required. For more info on nested resources, see [Set name and type for child resources in Bicep](../azure-resource-manager/bicep/child-resource-name-type.md).
+
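+If you prefer Azure PowerShell over the Azure CLI, the same Bicep file can be deployed with `New-AzResourceGroupDeployment`. The following is a minimal sketch; the resource group, file, and instance names are placeholders, and it assumes a local Bicep CLI is available for the transpilation.
+
+```powershell
+# Hedged sketch: deploy the Bicep template with Azure PowerShell (assumed names).
+$tenantId = (Get-AzContext).Tenant.Id
+New-AzResourceGroupDeployment -ResourceGroupName 'myResourceGroup' `
+    -TemplateFile './healthcareapis.bicep' `
+    -workspaceName 'myworkspace' -fhirName 'myfhir' `
+    -dicomName 'mydicom' -iotName 'myiot' -tenantId $tenantId
+```
+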
+## Debugging Bicep templates
+
+You can debug Bicep templates in Visual Studio Code or other environments, and troubleshoot issues based on the deployment response. Also, you can review the activity log for a specific resource in the resource group while debugging.
+
+In addition, you can use the **output** value for debugging or as part of the deployment response. For example, you can define two output values to display the values of authority and audience for the FHIR service in the response. For more information, see [Outputs in Bicep](../azure-resource-manager/bicep/outputs.md).
+
+```
+output stringOutput1 string = authority
+output stringOutput2 string = audience
+```
+
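+After a deployment completes, you can also read the outputs back with Azure PowerShell; the resource group and deployment names below are placeholders.
+
+```powershell
+# Hedged sketch: read deployment outputs after the deployment finishes.
+$deployment = Get-AzResourceGroupDeployment -ResourceGroupName 'myResourceGroup' `
+    -Name 'healthcareapis'
+$deployment.Outputs['stringOutput1'].Value   # authority
+$deployment.Outputs['stringOutput2'].Value   # audience
+```
+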
+## Next steps
+
+In this article, you learned how to create Healthcare APIs, including workspaces, FHIR services, DICOM services, and IoT connectors using Bicep. You also learned how to create and debug Bicep templates. For more information about Healthcare APIs, see
+
+>[!div class="nextstepaction"]
+>[What is Azure Healthcare APIs](healthcare-apis-overview.md)
iot-central Howto Export Data https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-central/core/howto-export-data.md
description: How to use the new data export to export your IoT data to Azure and
Previously updated : 10/20/2021 Last updated : 01/31/2022
IoT Central exports data in near real time to a database table in the Azure Data
To query the exported data in the Azure Data Explorer portal, navigate to the database and select **Query**.
+### Connection options
+
+Azure Data Explorer destinations let you configure the connection with a *service principal* or a [managed identity](../../active-directory/managed-identities-azure-resources/overview.md).
+
+Managed identities are more secure because:
+
+- You don't store the credentials for your resource in your IoT Central application.
+- The credentials are automatically tied to the lifetime of your IoT Central application.
+- Managed identities automatically rotate their security keys regularly.
+
+IoT Central currently uses [system-assigned managed identities](../../active-directory/managed-identities-azure-resources/overview.md#managed-identity-types).
+
+When you configure a managed identity, the configuration includes a *scope* and a *role*:
+
+- The scope defines where you can use the managed identity.
+- The role defines what permissions the IoT Central application is granted in the destination service.
+
+This article shows how to create a managed identity using the Azure CLI. You can also use the Azure portal to create a managed identity.
+
+# [Webhook](#tab/webhook)
+
+For webhook destinations, IoT Central exports data in near real time. The data in the message body is in the same format as for Event Hubs and Service Bus.
+
+### Create a webhook destination
+
+You can export data to a publicly available HTTP webhook endpoint. You can create a test webhook endpoint using [RequestBin](https://requestbin.net/). RequestBin throttles requests when the request limit is reached:
+
+1. Open [RequestBin](https://requestbin.net/).
+1. Create a new RequestBin and copy the **Bin URL**. You use this URL when you test your data export.
+
+To create the webhook destination in IoT Central on the **Create new destination** page:
+
+1. Enter a **Destination name**.
+
+1. Select **Webhook** as the destination type.
+
+1. Paste the callback URL for your webhook endpoint. You can optionally configure webhook authorization and add custom headers.
+
+ - For **OAuth2.0**, only the client credentials flow is supported. When you save the destination, IoT Central communicates with your OAuth provider to retrieve an authorization token. This token is attached to the `Authorization` header for every message sent to this destination.
+ - For **Authorization token**, you can specify a token value that's directly attached to the `Authorization` header for every message sent to this destination.
+
+1. Select **Save**.
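+
+You can optionally confirm that the endpoint is reachable before relying on the export; the following minimal sketch posts a test message from Azure PowerShell (the Bin URL and payload are placeholders):
+
+```powershell
+# Hedged sketch: post a test message to the webhook endpoint.
+Invoke-RestMethod -Method Post -Uri 'https://requestbin.net/r/<your-bin>' `
+    -ContentType 'application/json' -Body '{"test":"hello"}'
+```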
+++
+# [Service principal](#tab/service-principal/data-explorer)
+ ### Create an Azure Data Explorer destination If you don't have an existing Azure Data Explorer database to export to, follow these steps:
To create the Azure Data Explorer destination in IoT Central on the **Create new
:::image type="content" source="media/howto-export-data/export-destination.png" alt-text="Screenshot of Azure Data Explorer export destination.":::
-# [Webhook](#tab/webhook)
+# [Managed identity](#tab/managed-identity/data-explorer)
-For webhook destinations, IoT Central exports data in near real time. The data in the message body is in the same format as for Event Hubs and Service Bus.
+### Create an Azure Data Explorer destination
-### Create a webhook destination
+If you don't have an existing Azure Data Explorer database to export to, follow these steps. You have two options for creating an Azure Data Explorer database:
-You can export data to a publicly available HTTP webhook endpoint. You can create a test webhook endpoint using [RequestBin](https://requestbin.net/). RequestBin throttles request when the request limit is reached:
+- Create a new Azure Data Explorer cluster and database. To learn more, see the [Azure Data Explorer quickstart](/azure/data-explorer/create-cluster-database-portal). Make a note of the cluster URI and the name of the database you create; you need these values in the following steps.
+- Create a new Azure Synapse Data Explorer pool and database. To learn more, see the [Azure Data Explorer quickstart](../../synapse-analytics/get-started-analyze-data-explorer.md). Make a note of the pool URI and the name of the database you create; you need these values in the following steps.
-1. Open [RequestBin](https://requestbin.net/).
-1. Create a new RequestBin and copy the **Bin URL**. You use this URL when you test your data export.
+To configure the managed identity that enables your IoT Central application to securely export data to your Azure resource:
+
+1. Create a managed identity for your IoT Central application to use to connect to your database. Use the Azure Cloud Shell to run the following command:
+
+ ```azurecli
+ az iot central app identity assign --name {your IoT Central app name} \
+ --resource-group {resource group name} \
+ --system-assigned
+ ```
+
+ Make a note of the `principalId` and `tenantId` output by the command. You use these values in the following step.
+
+1. Configure the database permissions to allow connections from your IoT Central application. Use the Azure Cloud Shell to run the following command:
+
+ ```azurecli
+ az kusto database-principal-assignment create --cluster-name {name of your cluster} \
+ --database-name {name of your database} \
+ --resource-group {resource group name} \
+ --principal-assignment-name {name of your IoT Central application} \
+ --principal-id {principal id from the previous step} \
+ --principal-type App --role Admin \
+ --tenant-id {tenant id from the previous step}
+ ```
+
+ > [!TIP]
+ > If you're using Azure Synapse, see [`az synapse kusto database-principal-assignment`](/cli/azure/synapse/kusto/database-principal-assignment).
+
+1. Create a table in your database with a suitable schema for the data you're exporting. The following example query creates a table called `smartvitalspatch`. To learn more, see [Transform data inside your IoT Central application for export](howto-transform-data-internally.md):
+
+ ```kusto
+ .create table smartvitalspatch (
+ EnqueuedTime:datetime,
+ Message:string,
+ Application:string,
+ Device:string,
+ Simulated:boolean,
+ Template:string,
+ Module:string,
+ Component:string,
+ Capability:string,
+ Value:dynamic
+ )
+ ```
+
+1. (Optional) To speed up ingesting data into your Azure Data Explorer database:
+
+ 1. Navigate to the **Configurations** page for your Azure Data Explorer cluster. Then enable the **Streaming ingestion** option.
+ 1. Run the following query to alter the table policy to enable streaming ingestion:
+
+ ```kusto
+ .alter table smartvitalspatch policy streamingingestion enable
+ ```
To create the Azure Data Explorer destination in IoT Central on the **Create new destination** page: 1. Enter a **Destination name**.
-1. Select **Webhook** as the destination type.
+1. Select **Azure Data Explorer** as the destination type.
-1. Paste the callback URL for your webhook endpoint. You can optionally configure webhook authorization and add custom headers.
+1. Enter your Azure Data Explorer cluster or pool URL, database name, and table name. Select **System-assigned managed identity** as the authorization type.
- - For **OAuth2.0**, only the client credentials flow is supported. When you save the destination, IoT Central communicates with your OAuth provider to retrieve an authorization token. This token is attached to the `Authorization` header for every message sent to this destination.
- - For **Authorization token**, you can specify a token value that's directly attached to the `Authorization` header for every message sent to this destination.
-
-1. Select **Save**.
+ > [!TIP]
+ > The cluster URL for a standalone Azure Data Explorer looks like `https://<ClusterName>.<AzureRegion>.kusto.windows.net`. The cluster URL for an Azure Synapse Data Explorer pool looks like `https://<DataExplorerPoolName>.<SynapseWorkspaceName>.kusto.azuresynapse.net`.
-
+ :::image type="content" source="media/howto-export-data/export-destination-managed.png" alt-text="Screenshot of Azure Data Explorer export destination.":::
# [Connection string](#tab/connection-string/event-hubs)
If you don't have an existing Event Hubs namespace to export to, follow these st
- Copy either the primary or secondary connection string. You use this connection string to set up a new destination in IoT Central. - Alternatively, you can generate a connection string for the entire Event Hubs namespace: 1. Go to your Event Hubs namespace in the Azure portal.
- 2. Under **Settings**, select **Shared Access Policies**
+ 2. Under **Settings**, select **Shared Access Policies**.
3. Create a new key or choose an existing key that has **Send** permissions.
- 4. Copy either the primary or secondary connection string
+ 4. Copy either the primary or secondary connection string.
To create the Event Hubs destination in IoT Central on the **Create new destination** page:
If you don't have an existing Service Bus namespace to export to, follow these s
- Copy either the primary or secondary connection string. You use this connection string to set up a new destination in IoT Central. - Alternatively, you can generate a connection string for the entire Service Bus namespace: 1. Go to your Service Bus namespace in the Azure portal.
- 2. Under **Settings**, select **Shared Access Policies**
+ 2. Under **Settings**, select **Shared Access Policies**.
3. Create a new key or choose an existing key that has **Send** permissions.
- 4. Copy either the primary or secondary connection string
+ 4. Copy either the primary or secondary connection string.
To create the Service Bus destination in IoT Central on the **Create new destination** page:
iot-develop Quickstart Devkit Mxchip Az3166 Iot Hub https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-develop/quickstart-devkit-mxchip-az3166-iot-hub.md
Last updated 06/09/2021
-# Quickstart: Connect an MXCHIP AZ3166 devkit to IoT Hub
+# Connect an MXCHIP AZ3166 devkit to IoT Hub
**Applies to**: [Embedded device development](about-iot-develop.md#embedded-device-development)<br> **Total completion time**: 30 minutes
iot-develop Quickstart Devkit Stm B L475e Freertos https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-develop/quickstart-devkit-stm-b-l475e-freertos.md
+
+ Title: Connect an STMicroelectronics B-L475E to Azure IoT Central quickstart
+description: Use Azure FreeRTOS device middleware to connect an STMicroelectronics B-L475E-IOT01A Discovery kit to Azure IoT and send telemetry.
+++
+ms.devlang: c
+ Last updated : 01/27/2022+
+#Customer intent: As a device builder, I want to see a working IoT device sample connecting to Azure IoT, sending properties and telemetry, and responding to commands. As a solution builder, I want to use a tool to view the properties, commands, and telemetry an IoT Plug and Play device reports to the IoT hub it connects to.
++
+# Quickstart: Connect an STMicroelectronics B-L475E-IOT01A Discovery kit to Azure IoT Central
+
+**Applies to**: [Embedded device development](about-iot-develop.md#embedded-device-development)<br>
+**Total completion time**: 30 minutes
+
+In this quickstart, you use the Azure FreeRTOS middleware to connect the STMicroelectronics B-L475E-IOT01A Discovery kit (from now on, the STM DevKit) to Azure IoT Central.
+
+You complete the following tasks:
+
+* Install a set of embedded development tools to program an STM DevKit
+* Build an image and flash it onto the STM DevKit
+* Use Azure IoT Central to create cloud components, view properties, view device telemetry, and call direct commands
+
+## Prerequisites
+
+Operating system: Windows 10 or Windows 11
+
+Hardware:
+- STM [B-L475E-IOT01A](https://www.st.com/en/evaluation-tools/b-l475e-iot01a.html) devkit
+- USB 2.0 A male to Micro USB male cable
+- Wi-Fi 2.4 GHz
+
+## Prepare the development environment
+
+To set up your development environment, first you clone a GitHub repo that contains all the assets you need for the tutorial. Then you install a set of programming tools.
+
+### Clone the repo
+
+Clone the following repo to download all sample device code, setup scripts, and offline versions of the documentation. If you previously cloned this repo in another tutorial, you don't have to do it again.
+
+To clone the repo, run the following command:
+
+```shell
+git clone --recursive https://github.com/Azure-Samples/iot-middleware-freertos-samples
+```
+
+### Install Ninja
+
+Ninja is a build tool that you use to build an image for the STM DevKit.
+
+1. Download [Ninja](https://github.com/ninja-build/ninja/releases) and unzip it to your local disk.
+1. Add the path to the Ninja executable to the `PATH` environment variable (see the example below).
+1. Open a new console to recognize the update, and confirm that the Ninja binary is available in the `PATH` environment variable:
+ ```shell
+ ninja --version
+ ```
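+
+   For example, if you unzipped Ninja to *C:\tools\ninja* (an arbitrary location; substitute your own), one way to append it to the user `PATH` from a Windows console is:
+
+   ```shell
+   setx PATH "%PATH%;C:\tools\ninja"
+   ```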
+
+### Install the tools
+
+The cloned repo contains a setup script that installs and configures the required tools. If you installed these tools in another tutorial in the getting started guide, you don't have to do it again.
+
+> [!NOTE]
+> The setup script installs the following tools:
+> * [CMake](https://cmake.org): Build
+> * [ARM GCC](https://developer.arm.com/tools-and-software/open-source-software/developer-tools/gnu-toolchain/gnu-rm): Compile
+> * [Termite](https://www.compuphase.com/software_termite.htm): Monitor serial port output for connected devices
+
+To install the tools:
+
+1. From File Explorer, navigate to the following path in the repo and run the setup script named *get-toolchain.bat*:
+
+ > *iot-middleware-freertos-samples\tools\get-toolchain.bat*
+
+1. After the installation, open a new console window to recognize the configuration changes made by the setup script. Use this console to complete the remaining programming tasks in the tutorial. You can use Windows CMD, PowerShell, or Git Bash for Windows.
+1. Run the following command to confirm that CMake version **3.20** or later is installed.
+
+ ```shell
+ cmake --version
+ ```
++
+## Prepare the device
+To connect the STM DevKit to Azure, modify configuration settings, build the image, and flash the image to the device.
+
+### Add configuration
+
+1. Open the following file in a text editor:
+
+ *iot-middleware-freertos-samples/demos/projects/ST/b-l475e-iot01a/config/demo_config.h*
+
+1. Set the Wi-Fi constants to the following values from your local environment.
+
+ |Constant name|Value|
+ |-|--|
+ |`WIFI_SSID` |{*Your Wi-Fi ssid*}|
+ |`WIFI_PASSWORD` |{*Your Wi-Fi password*}|
+ |`WIFI_SECURITY_TYPE` |{*One of the enumerated Wi-Fi mode values in the file*}|
+
+1. Set the Azure IoT device information constants to the values that you saved after you created Azure resources.
+
+ |Constant name|Value|
+ |-|--|
+ |`democonfigID_SCOPE` |{*Your ID scope value*}|
+ |`democonfigREGISTRATION_ID` |{*Your Device ID value*}|
+ |`democonfigDEVICE_SYMMETRIC_KEY` |{*Your Primary key value*}|
+
+1. Save and close the file.
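+
+For reference, the edited constants in *demo_config.h* might look like the following sketch. Every value shown is a placeholder: use your own network details, pick one of the enumerated security values defined in the file (`eWiFiSecurityWPA2` is shown only as an example), and copy the ID scope, registration ID, and key from your own device connection details:
+
+```c
+/* Wi-Fi settings (placeholders; use your own) */
+#define WIFI_SSID                      "my-wifi-ssid"
+#define WIFI_PASSWORD                  "my-wifi-password"
+#define WIFI_SECURITY_TYPE             eWiFiSecurityWPA2
+
+/* Azure IoT device settings (placeholders; copy from your device connection details) */
+#define democonfigID_SCOPE             "0ne000000000"
+#define democonfigREGISTRATION_ID      "my-stm-devkit"
+#define democonfigDEVICE_SYMMETRIC_KEY "my-device-primary-key"
+```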
+
+### Build the image
+
+1. In your console, run the following commands from the *iot-middleware-freertos-samples* directory to build the device image:
+
+ ```shell
+ cmake -G Ninja -DVENDOR=ST -DBOARD=b-l475e-iot01a -Bb-l475e-iot01a .
+ cmake --build b-l475e-iot01a
+ ```
+
+2. After the build completes, confirm that the binary file was created in the following path:
+
+ *iot-middleware-freertos-samples\b-l475e-iot01a\demos\projects\ST\b-l475e-iot01a\iot-middleware-sample-gsg.bin*
+
+### Flash the image
+
+1. On the STM DevKit board, locate the **Reset** button (1), the Micro USB port (2), which is labeled **USB STLink**, and the board part number (3). You will refer to these items in the next steps. All of them are highlighted in the following picture:
+
+ :::image type="content" source="media/quickstart-devkit-stm-b-l475e-freertos/stm-devkit-board-475.png" alt-text="Locate key components on the STM DevKit board":::
+
+1. Connect the Micro USB cable to the **USB STLINK** port on the STM DevKit, and then connect it to your computer.
+
+ > [!NOTE]
+   > For detailed setup information about the STM DevKit, see the instructions on the packaging, or see [B-L475E-IOT01A Resources](https://www.st.com/en/evaluation-tools/b-l475e-iot01a.html#resource).
+
+1. In File Explorer, find the binary file named *iot-middleware-sample-gsg.bin* that you created previously.
+
+1. In File Explorer, find the STM Devkit board that's connected to your computer. The device appears as a drive on your system with the drive label **DIS_L4IOT**.
+
+1. Paste the binary file into the root folder of the STM Devkit. The process to flash the board starts automatically and completes in a few seconds.
+
+ > [!NOTE]
+ > During the process, an LED toggles between red and green on the STM DevKit.
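+
+   If you prefer the command line, the flash step is a plain file copy. A sketch, run from the *iot-middleware-freertos-samples* directory and assuming the board mounted as drive *E:* (check File Explorer for the actual letter):
+
+   ```shell
+   copy b-l475e-iot01a\demos\projects\ST\b-l475e-iot01a\iot-middleware-sample-gsg.bin E:\
+   ```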
+
+### Confirm device connection details
+
+You can use the **Termite** app to monitor communication and confirm that your device is set up correctly.
+
+1. Start **Termite**.
+ > [!TIP]
+ > If you are unable to connect Termite to your devkit, install the [ST-LINK driver](https://www.st.com/en/development-tools/stsw-link009.html) and try again. See [Troubleshooting](troubleshoot-embedded-device-quickstarts.md) for additional steps.
+1. Select **Settings**.
+1. In the **Serial port settings** dialog, check the following settings and update if needed:
+ * **Baud rate**: 115,200
+   * **Port**: The port that your STM DevKit is connected to. If there are multiple port options in the dropdown, open Windows **Device Manager** and view **Ports** to identify which port to use.
+
+ :::image type="content" source="media/quickstart-devkit-stm-b-l475e/termite-settings.png" alt-text="Screenshot of serial port settings in the Termite app":::
+
+1. Select **OK**.
+1. Press the **Reset** button on the device. The button is black and is labeled on the device.
+1. In the **Termite** app, check the output to confirm that the device is initialized and connected to Azure IoT. After some initial connection details, you should begin to see your board sensors sending telemetry to Azure IoT.
+
+ ```output
+ Successfully sent telemetry message
+ [INFO] [MQTT] [receivePacket:885] Packet received. ReceivedBytes=2.
+ [INFO] [MQTT] [handlePublishAcks:1161] Ack packet deserialized with result: MQTTSuccess.
+ [INFO] [MQTT] [handlePublishAcks:1174] State record updated. New state=MQTTPublishDone.
+ Puback received for packet id: 0x00000003
+ [INFO] [AzureIoTDemo] [ulCreateTelemetry:197] Telemetry message sent {"magnetometerX":-204,"magnetometerY":-215,"magnetometerZ":-875}
+
+ Successfully sent telemetry message
+ [INFO] [MQTT] [receivePacket:885] Packet received. ReceivedBytes=2.
+ [INFO] [MQTT] [handlePublishAcks:1161] Ack packet deserialized with result: MQTTSuccess.
+ [INFO] [MQTT] [handlePublishAcks:1174] State record updated. New state=MQTTPublishDone.
+ Puback received for packet id: 0x00000004
+ [INFO] [AzureIoTDemo] [ulCreateTelemetry:197] Telemetry message sent {"accelerometerX":22,"accelerometerY":4,"accelerometerZ":1005}
+
+ Successfully sent telemetry message
+ [INFO] [MQTT] [receivePacket:885] Packet received. ReceivedBytes=2.
+ [INFO] [MQTT] [handlePublishAcks:1161] Ack packet deserialized with result: MQTTSuccess.
+ [INFO] [MQTT] [handlePublishAcks:1174] State record updated. New state=MQTTPublishDone.
+ Puback received for packet id: 0x00000005
+ [INFO] [AzureIoTDemo] [ulCreateTelemetry:197] Telemetry message sent {"gyroscopeX":0,"gyroscopeY":-700,"gyroscopeZ":350}
+ ```
+
+ > [!IMPORTANT]
+ > If the DNS client initialization fails and notifies you that the Wi-Fi firmware is out of date, you'll need to update the Wi-Fi module firmware. Download and install the [Inventek ISM 43362 Wi-Fi module firmware update](https://www.st.com/resource/en/utilities/inventek_fw_updater.zip). Then press the **Reset** button on the device to recheck your connection, and continue with this quickstart.
+
+Keep Termite open to monitor device output in the remaining steps.
+
+## Verify the device status
+
+To view the device status in the IoT Central portal:
+1. From the application dashboard, select **Devices** on the side navigation menu.
+1. Confirm that the **Device status** of the device is updated to **Provisioned**.
+1. Confirm that the **Device template** of the device has been updated to **STM L475 FreeRTOS Getting Started Guide**.
+
+ :::image type="content" source="media/quickstart-devkit-stm-b-l475e-freertos/iot-central-device-view-status.png" alt-text="Screenshot of device status in IoT Central":::
+
+## View telemetry
+
+In IoT Central, you can view the flow of telemetry from your device to the cloud.
+
+To view telemetry in IoT Central:
+1. From the application dashboard, select **Devices** on the side navigation menu.
+1. Select the device from the device list.
+1. Select the **Overview** tab on the device page, and view the telemetry as the device sends messages to the cloud.
+
+ :::image type="content" source="media/quickstart-devkit-stm-b-l475e-freertos/iot-central-device-telemetry.png" alt-text="Screenshot of device telemetry in IoT Central":::
+
+## Call a command on the device
+
+You can also use IoT Central to call a command that you have implemented on your device. In this section, you call a method that enables you to turn an LED on or off.
+
+To call a command in IoT Central portal:
+
+1. Select the **Command** tab from the device page.
+1. Set the **State** dropdown value to *True*, and then select **Run**. The LED light should turn on.
+
+ :::image type="content" source="media/quickstart-devkit-stm-b-l475e-freertos/iot-central-invoke-method.png" alt-text="Screenshot of calling a direct method on a device in IoT Central":::
+
+1. Set the **State** dropdown value to *False*, and then select **Run**. The LED light should turn off.
+
+## View device information
+
+You can view the device information from IoT Central.
+
+Select the **About** tab on the device page.
++
+## Troubleshoot and debug
+
+If you experience issues when you build the device code, flash the device, or connect, see [Troubleshooting](troubleshoot-embedded-device-quickstarts.md).
+
+To debug the application, see [Debugging with Visual Studio Code](https://github.com/azure-rtos/getting-started/blob/master/docs/debugging.md).
+
+## Clean up resources
+
+If you no longer need the Azure resources created in this tutorial, you can delete them from the IoT Central portal. Optionally, if you continue to another article in this Getting Started content, you can keep the resources you've already created and reuse them.
+
+To keep the Azure IoT Central sample application but remove only specific devices:
+
+1. Select the **Devices** tab for your application.
+1. Select the device from the device list.
+1. Select **Delete**.
+
+To remove the entire Azure IoT Central sample application and all its devices and resources:
+
+1. Select **Administration** > **Your application**.
+1. Select **Delete**.
+
+## Next steps
+
+In this quickstart, you built a custom image that contains the Azure FreeRTOS middleware sample code. Then you flashed the image to the STM DevKit device. You also used the IoT Central portal to create Azure resources, connect the STM DevKit securely to Azure, view telemetry, and send messages.
+
+As a next step, explore the following articles to learn how to work with embedded devices and connect them to Azure IoT.
+
+> [!div class="nextstepaction"]
+> [Azure FreeRTOS middleware samples](https://github.com/Azure-Samples/iot-middleware-freertos-samples)
+> [!div class="nextstepaction"]
+> [Azure RTOS embedded development quickstarts](quickstart-devkit-mxchip-az3166.md)
+> [!div class="nextstepaction"]
+> [Azure IoT device development documentation](./index.yml)
iot-hub-device-update Device Update Azure Real Time Operating System https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub-device-update/device-update-azure-real-time-operating-system.md
# Device Update for Azure IoT Hub tutorial using Azure Real Time Operating System (RTOS)
-This tutorial will walk through how to create the Device Update for IoT Hub Agent in Azure RTOS NetX Duo. It also provides simple APIs for developers to integrate the Device Update capability in their application. Explore [samples](https://github.com/azure-rtos/samples/tree/PublicPreview/ADU) of key semiconductors evaluation boards that include the get started guides to learn configure, build, and deploy the over-the-air (OTA) updates to the devices.
+This tutorial walks through how to create the Device Update for IoT Hub Agent in Azure RTOS NetX Duo. It also provides simple APIs for developers to integrate the Device Update capability in their applications. Explore [samples](https://github.com/azure-rtos/samples/tree/PublicPreview/ADU) of key semiconductor evaluation boards, which include getting started guides that show you how to configure, build, and deploy over-the-air (OTA) updates to the devices.
-In this tutorial you will learn how to:
+In this tutorial, you learn how to:
> [!div class="checklist"] > * Get started > * Tag your device
Learn more about [Azure RTOS](/azure/rtos/).
2. Select the Updates option under "Device management" from the left-hand navigation bar. 3. Select the Groups tab at the top of the page. 4. Select the Add button to create a new group.
-5. Select the IoT Hub tag you created in the previous step from the list. Select Create update group.
+5. Select the IoT Hub tag that you created in the previous step from the list. Select Create update group.
:::image type="content" source="media/create-update-group/select-tag.PNG" alt-text="Screenshot showing tag selection." lightbox="media/create-update-group/select-tag.PNG":::
Learn more about [Azure RTOS](/azure/rtos/).
## Deploy new firmware
-1. Once the group is created, you should see a new update available for your device group, with a link to the update under Pending Updates. You may need to Refresh once.
+1. Once the group is created, you should see a new update available for your device group, with a link to the update under Pending Updates. You might need to Refresh once.
2. Click on the available update.
-3. Confirm the correct group is selected as the target group. Schedule your deployment, then select Deploy update.
+3. Confirm that the correct group is selected as the target group. Schedule your deployment, then select Deploy update.
:::image type="content" source="media/deploy-update/select-update.png" alt-text="Select update" lightbox="media/deploy-update/select-update.png":::
Learn more about [Azure RTOS](/azure/rtos/).
:::image type="content" source="media/deploy-update/deployments-tab.png" alt-text="Deployments tab" lightbox="media/deploy-update/deployments-tab.png":::
-2. Select the deployment you created to view the deployment details.
+2. Select the deployment that you created to view the deployment details.
:::image type="content" source="media/deploy-update/deployment-details.png" alt-text="Deployment details" lightbox="media/deploy-update/deployment-details.png"::: 3. Select Refresh to view the latest status details. Continue this process until the status changes to Succeeded.
-You have now completed a successful end-to-end image update using Device Update for IoT Hub on a Raspberry Pi 3 B+ device.
+You have now completed a successful end-to-end image update using Device Update for IoT Hub on an Azure RTOS embedded device.
## Cleanup resources
-When no longer needed cleanup your device update account, instance, IoT Hub and IoT device.
+When no longer needed, clean up your device update account, instance, IoT Hub, and IoT device.
## Next steps
iot-hub-device-update Update Manifest https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub-device-update/update-manifest.md
## Overview
-Device Update for IoT Hub uses an _update manifest_ to communicate actions and also metadata supporting those actions through the
+Device Update for IoT Hub uses an _update manifest_ to communicate actions and metadata that supports those actions through the
[AzureDeviceUpdateCore.OrchestratorMetadata:4](./device-update-plug-and-play.md) interface properties. This document describes the fundamentals of how the `updateManifest` property, in the `AzureDeviceUpdateCore.OrchestratorMetadata:4` interface, is used by the Device Update Agent. The `AzureDeviceUpdateCore.OrchestratorMetadata:4` interface properties are sent from the Device Update for IoT Hub service to the Device Update Agent. The `updateManifest` is a serialized JSON object that is parsed by the Device Update Agent.
+The update manifest is automatically generated after you create an import manifest. For more information on how to generate an import manifest, see [how to import an update](./import-update.md).
+ ### An example update manifest ```JSON
Device Update for IoT Hub uses an _update manifest_ to communicate actions and a
The purpose of the update manifest is to describe the contents of an update, namely its identity, type, installed criteria, and update file metadata. In addition, the update manifest is cryptographically signed to
-allow the Device Update Agent to verify its authenticity. Refer to the [Device Update security](./device-update-security.md) documentation for more information on how the update manifest is used to securely import content.
+allow the Device Update Agent to verify its authenticity. For more information, see the document on [Device Update security](./device-update-security.md).
## Import manifest vs update manifest
-It is important to understand the differences between the import manifest and the update manifest concepts in Device Update for IoT Hub.
+It is important to understand the differences between the import manifest and the update manifest.
* The [import manifest](./import-concepts.md) is created by whoever creates the corresponding update. It describes the contents of the update that will be imported into Device Update for IoT Hub. * The update manifest is automatically generated by the Device Update for IoT Hub service, using some of the properties that were defined in the import manifest. It is used to communicate relevant information to the Device Update Agent during the update process.
to determine compatible devices for the update.
### updateType
-Represents the type of update which is handled by a specific type of update handler. It follows the form
+Represents the type of update that is handled by a specific type of update handler. It follows the form
of `microsoft/swupdate:1` for an image-based update and `microsoft/apt:1` for a package-based update (see `Update Handler Types` section below). ### installedCriteria
for each update type supported by Device Update for IoT Hub.
### files
-Tells the Device Update Agent which files to download, and the hash that will be used to use to verify the files were downloaded correctly.
+Tells the Device Update Agent which files to download, and the hash that will be used to verify that the files downloaded correctly.
Here's a closer look at the `files` property contents: ```json
iot-hub Iot Hub Message Enrichments Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/iot-hub-message-enrichments-overview.md
The **value** can be any of the following examples:
* Information from the device twin, such as its path. Examples would be *$twin.tags.field* and *$twin.tags.latitude*. > [!NOTE]
- > At this time, only $iothubname, $twin.tags, $twin.properties.desired, and $twin.properties.reported are supported variables for message enrichment.
+ > At this time, only $iothubname, $twin.tags, $twin.properties.desired, and $twin.properties.reported are supported variables for message enrichment. Additionally, only primitive types are supported for enrichments. Messages cannot be enriched with object types.
Message Enrichments are added as application properties to messages sent to chosen endpoint(s).
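For illustration, a minimal Azure CLI sketch that adds an enrichment using one of the supported variables (the hub name, enrichment key, and endpoint name are placeholders):

```shell
# Add an enrichment that stamps each message with a value from the device twin
az iot hub message-enrichment create \
  --name my-hub \
  --key deviceLatitude \
  --value '$twin.tags.latitude' \
  --endpoints my-endpoint
```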
key-vault Certificate Access Control https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/key-vault/certificates/certificate-access-control.md
- Permissions for privileged operations - **purge**: Purge (permanently delete) a deleted certificate
-For more information, see the [Certificate operations in the Key Vault REST API reference](/rest/api/keyvault). For information on establishing permissions, see [Vaults - Update Access Policy](/rest/api/keyvault/vaults/updateaccesspolicy).
+For more information, see the [Certificate operations in the Key Vault REST API reference](/rest/api/keyvault). For information on establishing permissions, see [Vaults - Update Access Policy](/rest/api/keyvault/keyvault/vaults/update-access-policy).
## Troubleshoot You may see an error due to a missing access policy. For example: ```Error type : Access denied or user is unauthorized to create certificate```
key-vault Create Certificate Signing Request https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/key-vault/certificates/create-certificate-signing-request.md
Example
- [Authentication, requests, and responses](../general/authentication-requests-and-responses.md) - [Key Vault Developer's Guide](../general/developers-guide.md) - [Azure Key Vault REST API reference](/rest/api/keyvault)-- [Vaults - Create or Update](/rest/api/keyvault/vaults/createorupdate)-- [Vaults - Update Access Policy](/rest/api/keyvault/vaults/updateaccesspolicy)
+- [Vaults - Create or Update](/rest/api/keyvault/keyvault/vaults/create-or-update)
+- [Vaults - Update Access Policy](/rest/api/keyvault/keyvault/vaults/update-access-policy)
key-vault How To Export Certificate https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/key-vault/certificates/how-to-export-certificate.md
After a Key Vault certificate is created, you can retrieve it from the addressab
- **Exportable**: The policy used to create the certificate indicates the key is exportable. - **Non-exportable**: The policy used to create the certificate indicates the key is non-exportable. In this case, the private key isn't part of the value when it's retrieved as a secret.
-Supported keytypes: RSA, RSA-HSM, EC, EC-HSM, oct (listed [here](/rest/api/keyvault/createcertificate/createcertificate#jsonwebkeytype))
+Supported key types: RSA, RSA-HSM, EC, EC-HSM, oct (listed [here](/rest/api/keyvault/certificates/create-certificate/create-certificate#jsonwebkeytype))
Exportable is only allowed with RSA and EC key types; HSM keys are non-exportable. For more information, see [About Azure Key Vault certificates](./about-certificates.md#exportable-or-non-exportable-key).
key-vault How To Integrate Certificate Authority https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/key-vault/certificates/how-to-integrate-certificate-authority.md
Error message: "Please perform a merge to complete this certificate request."
Merge the CSR signed by the certificate authority to complete the request. For information about merging a CSR, see [Create and merge a CSR](./create-certificate-signing-request.md).
-For more information, see [Certificate operations in the Key Vault REST API reference](/rest/api/keyvault). For information on establishing permissions, see [Vaults - Create or update](/rest/api/keyvault/vaults/createorupdate) and [Vaults - Update access policy](/rest/api/keyvault/vaults/updateaccesspolicy).
+For more information, see [Certificate operations in the Key Vault REST API reference](/rest/api/keyvault). For information on establishing permissions, see [Vaults - Create or update](/rest/api/keyvault/keyvault/vaults/create-or-update) and [Vaults - Update access policy](/rest/api/keyvault/keyvault/vaults/update-access-policy).
## Frequently asked questions
key-vault Tutorial Import Certificate https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/key-vault/certificates/tutorial-import-certificate.md
When no longer needed, delete the resource group, which deletes the Key Vault an
In this tutorial, you created a Key Vault and imported a certificate in it. To learn more about Key Vault and how to integrate it with your applications, continue on to the articles below. - Read more about [Managing certificate creation in Azure Key Vault](./create-certificate-scenarios.md)-- See examples of [Importing Certificates Using REST APIs](/rest/api/keyvault/importcertificate/importcertificate)
+- See examples of [Importing Certificates Using REST APIs](/rest/api/keyvault/certificates/import-certificate/import-certificate)
- Review the [Key Vault security overview](../general/security-features.md)
key-vault Whats New https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/key-vault/general/whats-new.md
Here's what's new with Azure Key Vault. New features and improvements are also announced on the [Azure updates Key Vault channel](https://azure.microsoft.com/updates/?category=security&query=Key%20vault).
+## January 2022
+
+Azure Key Vault service throughput limits have been doubled for each vault to help ensure high performance for applications. For example, for secret GET operations and RSA 2,048-bit software keys, each vault now serves 4,000 GET transactions per 10 seconds, up from 2,000. The service quotas are specific to operation type; see the full list in [Azure Key Vault Service Limits](https://docs.microsoft.com/azure/key-vault/general/service-limits).
+
+For the Azure update announcement, see [General availability: Azure Key Vault increased service limits for all its customers](https://azure.microsoft.com/updates/azurekeyvaultincreasedservicelimits/)
++ ## December 2021 Automated encryption key rotation in Key Vault is now in preview. You can set a rotation policy on a key to schedule automated rotation and configure expiry notifications through Event Grid integration.
First preview version (version 2014-12-08-preview) was announced on January 8, 2
## Next steps
-If you have additional questions, please contact us through [support](https://azure.microsoft.com/support/options/).
+If you have additional questions, please contact us through [support](https://azure.microsoft.com/support/options/).
key-vault Quick Create Template https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/key-vault/keys/quick-create-template.md
More Azure Key Vault template samples can be found in [Azure Quickstart Template
|Parameter |Definition | ||| |**keyOps** | Specifies operations that can be performed by using the key. If you don't specify this parameter, all operations can be performed. The acceptable values for this parameter are a comma-separated list of key operations as defined by the [JSON Web Key (JWK) specification](https://tools.ietf.org/html/draft-ietf-jose-json-web-key-41): <br> `["sign", "verify", "encrypt", "decrypt", "wrapKey", "unwrapKey"]` |
-|**CurveName** | Elliptic curve (EC) name for EC key type. See [JsonWebKeyCurveName](/rest/api/keyvault/createkey/createkey#jsonwebkeycurvename) |
-|**Kty** | The type of key to create. For valid values, see [JsonWebKeyType](/rest/api/keyvault/createkey/createkey#jsonwebkeytype) |
+|**CurveName** | Elliptic curve (EC) name for EC key type. See [JsonWebKeyCurveName](/rest/api/keyvault/keys/create-key/create-key#jsonwebkeycurvename) |
+|**Kty** | The type of key to create. For valid values, see [JsonWebKeyType](/rest/api/keyvault/keys/create-key/create-key#jsonwebkeytype) |
|**Tags** | Application-specific metadata in the form of key-value pairs. | |**nbf** | Specifies the time, as a DateTime object, before which the key can't be used. The format would be Unix time stamp (the number of seconds after Unix Epoch on January 1st, 1970 at UTC). | |**exp** | Specifies the expiration time, as a DateTime object. The format would be Unix time stamp (the number of seconds after Unix Epoch on January 1st, 1970 at UTC). |
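To make these parameters concrete, here's a minimal sketch of a key resource in a template that uses them. The vault and key names are placeholders, and the `nbf`/`exp` values are arbitrary Unix time stamps:

```json
{
  "type": "Microsoft.KeyVault/vaults/keys",
  "apiVersion": "2019-09-01",
  "name": "[concat(parameters('keyVaultName'), '/', parameters('keyName'))]",
  "properties": {
    "kty": "EC",
    "curveName": "P-256",
    "keyOps": [ "sign", "verify" ],
    "attributes": {
      "nbf": 1640995200,
      "exp": 1672531200
    }
  }
}
```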
load-balancer Gateway Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/load-balancer/gateway-overview.md
Gateway Load Balancer consists of the following components:
## Pricing
-There's no charge for Gateway Load Balancer during preview.
-
-For pricing that will be effective during the general availability release, see [Load Balancer pricing](https://azure.microsoft.com/pricing/details/load-balancer/).
+For pricing, see [Load Balancer pricing](https://azure.microsoft.com/pricing/details/load-balancer/).
## Limitations
load-balancer Load Balancer Custom Probe Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/load-balancer/load-balancer-custom-probe-overview.md
The available protocols depend on the Load Balancer SKU used:
TCP probes initiate a connection by performing a three-way open TCP handshake with the defined port. TCP probes terminate a connection with a four-way close TCP handshake.
-The minimum probe interval is 5 seconds and the minimum number of unhealthy responses is 2. The total duration of all intervals cannot exceed 120 seconds.
+The minimum probe interval is 5 seconds, and the interval cannot exceed 120 seconds.
A TCP probe fails when: * The TCP listener on the instance doesn't respond at all during the timeout period. A probe is marked down based on the number of timed-out probe requests, which were configured to go unanswered before marking down the probe.
The following illustrates how you could express this kind of probe configuration
>[!NOTE] >HTTPS probe is only available for [Standard Load Balancer](./load-balancer-overview.md).
-HTTP and HTTPS probes build on the TCP probe and issue an HTTP GET with the specified path. Both of these probes support relative paths for the HTTP GET. HTTPS probes are the same as HTTP probes with the addition of a Transport Layer Security (TLS, formerly known as SSL) wrapper. The health probe is marked up when the instance responds with an HTTP status 200 within the timeout period. The health probe attempts to check the configured health probe port every 15 seconds by default. The minimum probe interval is 5 seconds. The total duration of all intervals cannot exceed 120 seconds.
+HTTP and HTTPS probes build on the TCP probe and issue an HTTP GET with the specified path. Both of these probes support relative paths for the HTTP GET. HTTPS probes are the same as HTTP probes with the addition of a Transport Layer Security (TLS, formerly known as SSL) wrapper. The health probe is marked up when the instance responds with an HTTP status 200 within the timeout period. The health probe attempts to check the configured health probe port every 15 seconds by default. The minimum probe interval is 5 seconds, and the interval cannot exceed 120 seconds.
HTTP / HTTPS probes can also be useful to implement your own logic to remove instances from load balancer rotation if the probe port is also the listener for the service itself. For example, you might decide to remove an instance if it's above 90% CPU and return a non-200 HTTP status.
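For illustration, a sketch of creating an HTTP probe with a custom interval by using the Azure CLI (all resource names are placeholders):

```shell
# Create an HTTP health probe that checks / on port 80 every 15 seconds
az network lb probe create \
  --resource-group my-rg \
  --lb-name my-lb \
  --name my-http-probe \
  --protocol Http \
  --port 80 \
  --path / \
  --interval 15
```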
load-balancer Load Balancer Standard Availability Zones https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/load-balancer/load-balancer-standard-availability-zones.md
Azure Load Balancer supports availability zones scenarios. You can use Standard Load Balancer to increase availability throughout your scenario by aligning resources with, and distribution across zones. Review this document to understand these concepts and fundamental scenario design guidance.
-A Load Balancer can either be **zone redundant, zonal,** or **non-zonal**. To configure the zone related properties (mentioned above) for your load balancer, select the appropriate type of frontend needed.
+A Load Balancer can be **zone redundant**, **zonal**, or **non-zonal**. To configure the zone-related properties (mentioned above) for your load balancer, select the appropriate type of frontend needed.
## Zone redundant
The frontend's IP address is served simultaneously by multiple independent infra
You can choose to have a frontend guaranteed to a single zone, which is known as a *zonal* frontend. This scenario means any inbound or outbound flow is served by a single zone in a region. Your frontend shares fate with the health of the zone. The data path is unaffected by failures in zones other than where it was guaranteed. You can use zonal frontends to expose an IP address per Availability Zone.
-Additionally, the use of zonal frontends directly for load balanced endpoints within each zone is supported. You can use this configuration to expose per zone load-balanced endpoints to individually monitor each zone. For public endpoints, you can integrate them with a DNS load-balancing product like [Traffic Manager](../traffic-manager/traffic-manager-overview.md) and use a single DNS name.
+Additionally, the use of zonal frontends directly for load-balanced endpoints within each zone is supported. You can use this configuration to expose per zone load-balanced endpoints to individually monitor each zone. For public endpoints, you can integrate them with a DNS load-balancing product like [Traffic Manager](../traffic-manager/traffic-manager-overview.md) and use a single DNS name.
<p align="center"> <img src="./media/az-zonal/zonal-lb-1.svg" alt="Figure depicts three zonal standard load balancers each directing traffic in a zone to three different subnets in a zonal configuration." width="512" title="Virtual Network NAT">
Load Balancers can also be created in a non-zonal configuration by use of a "no-
## <a name="design"></a> Design considerations
-Now that you understand the zone related properties for Standard Load Balancer, the following design considerations might help as you design for high availability.
+Now that you understand the zone-related properties for Standard Load Balancer, the following design considerations might help as you design for high availability.
### Tolerance to zone failure
Members in the backend pool of a load balancer are normally associated with a si
### Multiple frontends
-Using multiple frontends allow you to load balance traffic on more than one port and/or IP address. When designing your architecture, it is important to account for the way zone redundancy and multiple frontends can interact. Note that if the goal is to always have every frontend be resilient to failure, then all IP addresses assigned as frontends must be zone-redundant. If a set of frontends are intended to be associated with a single zone, then every IP address for that set must be associated with that specific zone. It is not required to have a load balancer for each zone; rather, each zonal frontend (or set of zonal frontends) could be associated with virtual machines in the backend pool that are part of that specific availability zone.
+Using multiple frontends allows you to load balance traffic on more than one port and/or IP address. When designing your architecture, it is important to account for the way zone redundancy and multiple frontends can interact. Note that if the goal is to always have every frontend be resilient to failure, then all IP addresses assigned as frontends must be zone-redundant. If a set of frontends is intended to be associated with a single zone, then every IP address for that set must be associated with that specific zone. It is not required to have a load balancer for each zone; rather, each zonal frontend (or set of zonal frontends) could be associated with virtual machines in the backend pool that are part of that specific availability zone.
### Transition between regional zonal models
-In the case where a region is augmented to have [availability zones](../availability-zones/az-overview.md) would remain non-zonal. In order to ensure your architecture can take advantage of the new zones, it is recommended that new frontend IPs be created, and the appropriate rules and configurations be replicated to utilizes these new public IPs.
+When a region is augmented to have [availability zones](../availability-zones/az-overview.md), any existing public IPs (for example, those used for Load Balancer frontends) remain non-zonal. To ensure that your architecture can take advantage of the new zones, we recommend that you create new frontend IPs and replicate the appropriate rules and configurations to use these new public IPs.
### Control vs data plane implications
machine-learning Component Reference https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/machine-learning/component-reference/component-reference.md
Title: "Algorithm & component reference"
-description: Learn about the Azure Machine Learning designer components you can use to create your own machine learning projects.
+description: Learn about the Azure Machine Learning designer components that you can use to create your own machine learning projects.
For help with choosing algorithms, see
| | | | | Data Input and Output | Move data from cloud sources into your pipeline. Write your results or intermediate data to Azure Storage, SQL Database, or Hive, while running a pipeline, or use cloud storage to exchange data between pipelines. | [Enter Data Manually](enter-data-manually.md) <br/> [Export Data](export-data.md) <br/> [Import Data](import-data.md) | | Data Transformation | Operations on data that are unique to machine learning, such as normalizing or binning data, dimensionality reduction, and converting data among various file formats.| [Add Columns](add-columns.md) <br/> [Add Rows](add-rows.md) <br/> [Apply Math Operation](apply-math-operation.md) <br/> [Apply SQL Transformation](apply-sql-transformation.md) <br/> [Clean Missing Data](clean-missing-data.md) <br/> [Clip Values](clip-values.md) <br/> [Convert to CSV](convert-to-csv.md) <br/> [Convert to Dataset](convert-to-dataset.md) <br/> [Convert to Indicator Values](convert-to-indicator-values.md) <br/> [Edit Metadata](edit-metadata.md) <br/> [Group Data into Bins](group-data-into-bins.md) <br/> [Join Data](join-data.md) <br/> [Normalize Data](normalize-data.md) <br/> [Partition and Sample](partition-and-sample.md) <br/> [Remove Duplicate Rows](remove-duplicate-rows.md) <br/> [SMOTE](smote.md) <br/> [Select Columns Transform](select-columns-transform.md) <br/> [Select Columns in Dataset](select-columns-in-dataset.md) <br/> [Split Data](split-data.md) |
-| Feature Selection | Select a subset of relevant, useful features to use in building an analytical model. | [Filter Based Feature Selection](filter-based-feature-selection.md) <br/> [Permutation Feature Importance](permutation-feature-importance.md) |
+| Feature Selection | Select a subset of relevant, useful features to build an analytical model. | [Filter Based Feature Selection](filter-based-feature-selection.md) <br/> [Permutation Feature Importance](permutation-feature-importance.md) |
| Statistical Functions | Provide a wide variety of statistical methods related to data science. | [Summarize Data](summarize-data.md)| ## Machine learning algorithms
For help with choosing algorithms, see
| | | | | Regression | Predict a value. | [Boosted Decision Tree Regression](boosted-decision-tree-regression.md) <br/> [Decision Forest Regression](decision-forest-regression.md) <br/> [Fast Forest Quantile Regression](fast-forest-quantile-regression.md) <br/> [Linear Regression](linear-regression.md) <br/> [Neural Network Regression](neural-network-regression.md) <br/> [Poisson Regression](poisson-regression.md) <br/>| | Clustering | Group data together.| [K-Means Clustering](k-means-clustering.md)
-| Classification | Predict a class. Choose from binary (two-class) or multiclass algorithms.| [Multiclass Boosted Decision Tree](multiclass-boosted-decision-tree.md) <br/> [Multiclass Decision Forest](multiclass-decision-forest.md) <br/> [Multiclass Logistic Regression](multiclass-logistic-regression.md) <br/> [Multiclass Neural Network](multiclass-neural-network.md) <br/> [One vs. All Multiclass](one-vs-all-multiclass.md) <br/> [One vs. One Multiclass](one-vs-one-multiclass.md) <br/>[Two-Class Averaged Perceptron](two-class-averaged-perceptron.md) <br/> [Two-Class Boosted Decision Tree](two-class-boosted-decision-tree.md) <br/> [Two-Class Decision Forest](two-class-decision-forest.md) <br/> [Two-Class Logistic Regression](two-class-logistic-regression.md) <br/> [Two-Class Neural Network](two-class-neural-network.md) <br/> [Two Class Support Vector Machine](two-class-support-vector-machine.md) |
+| Classification | Predict a class. Choose from binary (two-class) or multiclass algorithms.| [Multiclass Boosted Decision Tree](multiclass-boosted-decision-tree.md) <br/> [Multiclass Decision Forest](multiclass-decision-forest.md) <br/> [Multiclass Logistic Regression](multiclass-logistic-regression.md) <br/> [Multiclass Neural Network](multiclass-neural-network.md) <br/> [One vs. All Multiclass](one-vs-all-multiclass.md) <br/> [One vs. One Multiclass](one-vs-one-multiclass.md) <br/>[Two-Class Averaged Perceptron](two-class-averaged-perceptron.md) <br/> [Two-Class Boosted Decision Tree](two-class-boosted-decision-tree.md) <br/> [Two-Class Decision Forest](two-class-decision-forest.md) <br/> [Two-Class Logistic Regression](two-class-logistic-regression.md) <br/> [Two-Class Neural Network](two-class-neural-network.md) <br/> [Two Class Support Vector Machine](two-class-support-vector-machine.md) |
## Components for building and evaluating models
For help with choosing algorithms, see
| Recommendation | Build recommendation models. | [Evaluate Recommender](evaluate-recommender.md) <br/> [Score SVD Recommender](score-svd-recommender.md) <br/> [Score Wide and Deep Recommender](score-wide-and-deep-recommender.md)<br/> [Train SVD Recommender](train-SVD-recommender.md) <br/> [Train Wide and Deep Recommender](train-wide-and-deep-recommender.md)| | Anomaly Detection | Build anomaly detection models. | [PCA-Based Anomaly Detection](pca-based-anomaly-detection.md) <br/> [Train Anomaly Detection Model](train-anomaly-detection-model.md) | - ## Web service
-Learn about the [web service components](web-service-input-output.md) which are necessary for real-time inference in Azure Machine Learning designer.
+Learn about the [web service components](web-service-input-output.md), which are necessary for real-time inference in Azure Machine Learning designer.
## Error messages
-Learn about the [error messages and exception codes](designer-error-codes.md) you might encounter using components in Azure Machine Learning designer.
+Learn about the [error messages and exception codes](designer-error-codes.md) that you might encounter using components in Azure Machine Learning designer.
## Next steps
machine-learning Import Data https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/machine-learning/component-reference/import-data.md
Title: "Import Data: Component Reference"
-description: Learn how to use the Import Data component in Azure Machine Learning to load data into a machine learning pipeline from existing cloud data services.
+description: Learn how to use the Import Data component in Azure Machine Learning to load data into a machine learning pipeline from existing cloud data services.
The **Import Data** component supports reading data from the following sources: - Azure SQL Database - Azure PostgreSQL
- Azure SQL Database - Azure PostgreSQL
-Before using cloud storage, you need to register a datastore in your Azure Machine Learning workspace first. For more information, see [How to Access Data](../how-to-access-data.md).
+Before using cloud storage, you must first register a datastore in your Azure Machine Learning workspace. For more information, see [How to Access Data](../how-to-access-data.md).
After you define the data you want and connect to the source, **[Import Data](./import-data.md)** infers the data type of each column based on the values it contains, and loads the data into your designer pipeline. The output of **Import Data** is a dataset that can be used with any designer pipeline.
If your source data changes, you can refresh the dataset and add new data by rer
1. Select **Data source**, and choose the data source type. It could be HTTP or datastore.
- If you choose datastore, you can select existing datastores that already registered to your Azure Machine Learning workspace or create a new datastore. Then define the path of data to import in the datastore. You can easily browse the path by click **Browse Path**
- ![Screenshot shows the Browse path link which opens the Path selection dialog box.](media/module/import-data-path.png)
+ If you choose datastore, you can select existing datastores that are already registered to your Azure Machine Learning workspace or create a new datastore. Then define the path of data to import in the datastore. You can easily browse the path by selecting **Browse Path**.
+
+    :::image type="content" source="media/module/import-data-path.png" alt-text="Screenshot shows the Browse path link which opens the Path selection dialog box." lightbox="media/module/import-data-path.png":::
> [!NOTE] > **Import Data** component is for **Tabular** data only.
If your source data changes, you can refresh the dataset and add new data by rer
1. Select the preview schema to filter the columns you want to include. You can also define advanced settings like Delimiter in Parsing options.
- ![import-data-preview](media/module/import-data.png)
+ :::image type="content" source="media/module/import-data.png" alt-text="Screenshot of the schema preview with Column 3, 4, 5 and 6 selected.":::
-1. The checkbox, **Regenerate output**, decides whether to execute the component to regenerate output at running time.
+1. The **Regenerate output** checkbox determines whether the component is executed to regenerate its output at run time.
- It's by default unselected, which means if the component has been executed with the same parameters previously, the system will reuse the output from last run to reduce run time.
+   By default, it's unselected, which means that if the component has been executed with the same parameters previously, the system reuses the output from the last run to reduce run time.
- If it is selected, the system will execute the component again to regenerate output. So select this option when underlying data in storage is updated, it can help to get the latest data.
+   If it's selected, the system executes the component again to regenerate the output. Select this option when the underlying data in storage is updated; it can help you get the latest data.
1. Submit the pipeline.
If your source data changes, you can refresh the dataset and add new data by rer
When import completes, right-click the output dataset and select **Visualize** to see if the data was imported successfully.
-If you want to save the data for reuse, rather than importing a new set of data each time the pipeline is run, select the **Register dataset** icon under the **Outputs+logs** tab in the right panel of the component. Choose a name for the dataset. The saved dataset preserves the data at the time of saving, the dataset is not updated when the pipeline is rerun, even if the dataset in the pipeline changes. This can be useful for taking snapshots of data.
+If you want to save the data for reuse, rather than importing a new set of data each time the pipeline is run, select the **Register dataset** icon under the **Outputs+logs** tab in the right panel of the component. Choose a name for the dataset. The saved dataset preserves the data at the time of saving. The dataset is not updated when the pipeline is rerun, even if the dataset in the pipeline changes. This can be useful for taking snapshots of data.
-After importing the data, it might need some additional preparations for modeling and analysis:
+After you import the data, it might need some additional preparation for modeling and analysis:
-- Use [Edit Metadata](./edit-metadata.md) to change column names, to handle a column as a different data type, or to indicate that some columns are labels or features.
+- Use [Edit Metadata](./edit-metadata.md) to change column names, handle a column as a different data type, or indicate that some columns are labels or features.
- Use [Select Columns in Dataset](./select-columns-in-dataset.md) to select a subset of columns to transform or use in modeling. The transformed or removed columns can easily be rejoined to the original dataset by using the [Add Columns](./add-columns.md) component.
After importing the data, it might need some additional preparations for modelin
## Limitations
-Due to datstore access limitation, if your inference pipeline contains **Import Data** component, it will be auto-removed when deploy to real-time endpoint.
+Due to a datastore access limitation, if your inference pipeline contains an **Import Data** component, that component is automatically removed when the pipeline is deployed to a real-time endpoint.
## Next steps
-See the [set of components available](component-reference.md) to Azure Machine Learning.
+See the [set of components available](component-reference.md) to Azure Machine Learning.
machine-learning How To Attach Arc Kubernetes https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/machine-learning/how-to-attach-arc-kubernetes.md
kubectl get pods -n azureml
``` ## Update Azure Machine Learning extension
-Use ```k8s-extension update``` CLI command to update the mutable properties of Azure Machine Learning extension. For more information, see the [`k8s-extension update` CLI command documentation](/cli/azure/k8s-extension?view=azure-cli-latest#az_k8s_extension_update&preserve-view=true).
+Use ```k8s-extension update``` CLI command to update the mutable properties of Azure Machine Learning extension. For more information, see the [`k8s-extension update` CLI command documentation](/cli/azure/k8s-extension#az_k8s_extension_update).
1. Azure Arc supports update of ``--auto-upgrade-minor-version``, ``--version``, ``--configuration-settings``, ``--configuration-protected-settings``. 2. For configurationSettings, only the settings that require update need to be provided. If the user provides all settings, they would be merged/overwritten with the provided values.
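For example, a sketch of an update call that changes one configuration setting (the resource group, cluster, and extension names are placeholders, and the setting shown is illustrative):

```shell
# Update a single mutable configuration setting on the extension
az k8s-extension update \
  --resource-group my-rg \
  --cluster-name my-arc-cluster \
  --cluster-type connectedClusters \
  --name azureml-extension \
  --configuration-settings enableInference=True
```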
Use ```k8s-extension update``` CLI command to update the mutable properties of
## Delete Azure Machine Learning extension
-Use [`k8s-extension delete`](/cli/azure/k8s-extension?view=azure-cli-latest#az_k8s_extension_delete&preserve-view=true) CLI command to delete the Azure Machine Learning extension.
+Use [`k8s-extension delete`](/cli/azure/k8s-extension#az_k8s_extension_delete) CLI command to delete the Azure Machine Learning extension.
It takes around 10 minutes to delete all components deployed to the Kubernetes cluster. Run `kubectl get pods -n azureml` to check if all components were deleted.
machine-learning How To Custom Dns https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/machine-learning/how-to-custom-dns.md
Previously updated : 01/18/2022 Last updated : 02/01/2022
The following steps describe how this topology works:
**Azure Public regions**: - ```api.azureml.ms``` - ```notebooks.azure.net```
- - ```instances.azureml.us```
+ - ```instances.azureml.ms```
**Azure China regions**: - ```api.ml.azure.cn```
The following steps describe how this topology works:
**Azure Public regions**: - ```api.azureml.ms``` - ```notebooks.azure.net```
- - ```instances.azureml.us```
+ - ```instances.azureml.ms```
**Azure China regions**: - ```api.ml.azure.cn```
marketplace Azure App Marketing https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/azure-app-marketing.md
This article describes additional options you can choose if youΓÇÖre selling you
Providing information on the **Co-sell with Microsoft** tab is entirely optional. But itΓÇÖs required to achieve _Co-sell Ready_ and _IP Co-sell Ready_ status. The Microsoft sales teams use this information to learn more about your solution when evaluating its fit for customer needs. The information you provide on this tab isn't available directly to customers.
-For details and instructions to configure the **Co-sell with Microsoft** tab, see [Co-sell option in the commercial marketplace](co-sell-configure.md).
+For details and instructions to configure the **Co-sell with Microsoft** tab, see [Co-sell option in the commercial marketplace](/partner-center/co-sell-configure?context=/azure/marketplace/context/context).
## Resell through CSPs
marketplace Azure Container Plan Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/azure-container-plan-overview.md
Select **Create** and continue below.
## Next steps - [+ Create new plan](azure-container-plan-setup.md), or-- Exit plan setup and continue with optional [Co-sell with Microsoft](./co-sell-overview.md), or
+- Exit plan setup and continue with optional [Co-sell with Microsoft](/partner-center/co-sell-overview?context=/azure/marketplace/context/context), or
- [Review and publish your offer](review-publish-offer.md)
marketplace Azure Container Plan Technical Configuration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/azure-container-plan-technical-configuration.md
Select **Save draft**, then **← Plan overview** in the left-nav menu to retur
--> ## Next steps -- To **Co-sell with Microsoft** (optional), select it in the left-nav menu. For details, see [Co-sell partner engagement](./co-sell-overview.md).
+- To **Co-sell with Microsoft** (optional), select it in the left-nav menu. For details, see [Co-sell partner engagement](/partner-center/co-sell-overview?context=/azure/marketplace/context/context).
- To **Resell through CSPs** (Cloud Solution Partners, also optional), select it in the left-nav menu. For details, see [Resell through CSP Partners](cloud-solution-providers.md). - If you're not setting up either of these or you've finished, it's time to [Review and publish your offer](review-publish-offer.md).
marketplace Co Sell Configure https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/co-sell-configure.md
- Title: Configure co-sell for a commercial marketplace offer | Azure Marketplace
-description: The information you provide on the Co-sell with Microsoft tab for your offer is used by Microsoft sales teams to learn more about your offer when evaluating its fit for customer needs.
------ Previously updated : 01/11/2022--
-# Configure co-sell for a commercial marketplace offer
-
-This article describes how to configure the **Co-sell with Microsoft** tab for a commercial marketplace offer. Providing information on this tab is entirely optional but it is required to achieve [Co-sell Ready and IP Co-sell incentive status](/legal/marketplace/certification-policies#3000-requirements-for-co-sell-status). The information you provide will be used by Microsoft sales teams to learn more about your offer when evaluating its fit for customer needs. This information is not available directly to customers. For more information about co-sell, see [Co-sell with Microsoft sellers and partners overview](./co-sell-overview.md) and [Co-sell with Microsoft](https://partner.microsoft.com/membership/co-sell-with-microsoft).
-
-The Co-sell option is available for the following offer types.
--- Azure Application-- Azure Container-- Azure Virtual Machine-- Consulting service-- Dynamics 365 apps on Dataverse and Power Apps-- Dynamics 365 Operations Apps-- Dynamics 365 Business Central-- IoT Edge Module-- Managed Service-- Power BI App-- Software as a service (SaaS)-
-## Go to the Co-sell with Microsoft tab
-
-1. Sign in to [Partner Center](https://partner.microsoft.com/dashboard/home).
-
-1. On the Home page, select the **Marketplace offers** tile.
-
- [ ![Illustrates the Partner Center Home page.](./media/workspaces/partner-center-home.png) ](./media/workspaces/partner-center-home.png#lightbox)
-
- > [!TIP]
- > If you don't see the **Marketplace offers** tile, [create a commercial marketplace account in Partner Center](create-account.md) and make sure your account is enrolled in the commercial marketplace program.
-
-1. On the Marketplace offers page, select the offer you want to co-sell.
-
- > [!NOTE]
- > You can configure co-sell for a new offer that's not yet published or with an offer that's already published.
-
-1. In the menu on the left, select **Co-sell with Microsoft**.
-
- [ ![Illustrates the Co-sell with Microsoft page.](./media/co-sell/co-sell-with-microsoft-tab-workspaces.png) ](./media/co-sell/co-sell-with-microsoft-tab-workspaces.png#lightbox)
-
-## Co-sell listings
-
-Co-sell listings help Microsoft sales teams market your offer to a wider audience. You must provide the following information to achieve co-sell ready status:
--- Microsoft platforms (select one or more)-- Segments (select one or more)-- Solution type (select one)-- Solution sub-area (select one or two)-
-### Select Microsoft platforms and segments
-
-1. Under **Listing**, select one or more Microsoft platforms that your offer is built with, extends, or integrates with.
-1. Select one or more market segments that your offer is targeting.
-
-### Select solution types
-
-Solution types help define the scenarios that your offer is designed to address.
-
-- From the **Select a solution type** list, select the solution type that best matches your offer. The following table describes the available solution types.
-
-***Table 1: Available solution types***
-
-| **Solution type** | **Description** |
-| :- | :-|
-| Device (hardware) | An offer that involves building or selling hardware from a device manufacturer. |
-| IP (application) | Apps or other copyrightable material licensed for the customer's use. For example, a CRM program that can be licensed and installed on-premises. |
-| Managed Service | Hands-on expertise for a cloud-based project, usually on an ongoing basis. For example, providing a platform and tools for running an online database, with ongoing management provided by the managed service provider. |
-| Service | Hands-on expertise for a specific one-time project, often delivered via consultants. For example, setting up a customer database for a customer (with the customer assuming responsibility for operating the database after delivery). |
-|||
-
-### Select solution areas
-
-Solution areas help to further define your solution. This helps Microsoft sales teams find and understand your solution's value proposition. You must select at least one and up to a maximum of five solution areas for your offer. For each solution area, you can further choose up to five solution sub-areas.
-
-1. Select the **+ Add solution area (5 Max)** link.
-1. Select a solution area from the drop-down list that appears.
-1. Select at least one and up to five solution sub-areas. To select multiple sub-areas, use the `Ctrl` key (on Windows) or `Command` key (on macOS).
-1. To add another solution area, repeat steps 1 through 3.
-
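For illustration only, the selection limits described above (one to five solution areas, each with one to five sub-areas) can be expressed as a small Python check; the area and sub-area names below are invented:

```python
# Hedged sketch of the limits described above: 1-5 solution areas,
# each with 1-5 solution sub-areas. Names are invented for illustration.
selections = {
    "Analytics": ["Big Data", "Data Warehouse"],
    "Security": ["Identity & Access Management"],
}

def within_limits(selections):
    """Return True if the area/sub-area counts are within the stated limits."""
    if not 1 <= len(selections) <= 5:
        return False
    return all(1 <= len(subs) <= 5 for subs in selections.values())

print(within_limits(selections))  # True
```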
-## Upload documents
-
-You must provide collateral documents with details about your offer. Microsoft sales teams use this information to evaluate whether your offer fits customer needs and to recommend and sell it. The more information you provide, the better equipped Microsoft sales teams are to understand and promote your product.
-
-The supported file types are .pdf, .ppt, .pptx, .doc, .docx, .xls, .xlsx, .jpg, .png, and .mp4. Templates for some documents are provided in Table 2 below.
-
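As a quick, hypothetical illustration, a collateral file name can be checked against the supported extensions listed above; the file names below are invented:

```python
# Illustrative check of a collateral file against the supported
# extensions listed above. File names are hypothetical.
from pathlib import Path

SUPPORTED = {".pdf", ".ppt", ".pptx", ".doc", ".docx",
             ".xls", ".xlsx", ".jpg", ".png", ".mp4"}

def is_supported(filename):
    return Path(filename).suffix.lower() in SUPPORTED

print(is_supported("pitch-deck.pptx"))  # True
print(is_supported("demo.mov"))         # False
```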
-> [!NOTE]
-> The **Solution/offer one-pager** and **Solution/offer pitch deck** are required to attain Co-sell Ready status and are also prerequisites for some offers to achieve Azure IP Co-sell incentive status. The reference architecture diagram is also required for Azure IP Co-sell incentive status. The other documents described in this table are optional but recommended.
-
-***Table 2: Documents that support co-sell***
-
-| **Documents** | **Description** |
-| :- | :-|
-| *Solution/offer one-pager (Required)* | Drive awareness among potential customers with a professionally designed one-pager that showcases the value proposition of your solution.<br><br>You can use one of the relevant templates to provide a customer-ready description of your offering:<br><ul><li> [Microsoft Azure one-pager template](https://aka.ms/Customer-One-Pager_MicrosoftAzure)</li><li>[Microsoft Dynamics 365 one-pager template](https://aka.ms/Customer-One-Pager_MicrosoftDynamics365)</li> <li>[Microsoft 365 one-pager template](https://aka.ms/Customer-One-Pager_MicrosoftOffice365) </li><li>[Windows 10 one-pager template](https://aka.ms/Customer-One-Pager_Windows)</li></ul> <br> Microsoft sales teams may share this information with customers to help determine if your offering may be a good fit, and to ensure that it is customer ready. <br><br>A one-pager <i>should not</i> be longer than 10 pages. |
-| *Solution/offer pitch deck (Required)* | You can use the [Customer presentation template](https://aka.ms/GTMServices_CustomerPresentation) to create your pitch deck. This deck should reference the [Reference architecture diagram](reference-architecture-diagram.md). The purpose of this slide deck is to pitch your offer and its value proposition. After ensuring that your offer is customer ready, Microsoft sales teams may share this presentation with customers to articulate the value that your company and Microsoft bring when deploying a joint solution. The presentation should cover what your offer does, how it can help customers, what industries the offer is relevant for, and how it compares with competing solutions. |
-| *Customer case study* (Optional)| Use the [Case study template](https://aka.ms/GTM_Case_Study_Template) to create your customer case study. This information shows a potential customer how you and Microsoft have successfully deployed your offer in prior cases. |
-| *Verifiable customer wins* (Optional) | Provide specific examples of customer successes after your offer has been deployed. |
-| *Channel pitch deck* (Optional) | A slide deck with information that helps channel resellers learn more about your offer and get their sales teams ready to sell it. This deck typically includes an elevator pitch, information about target customers, questions to ask customers, talking points, and links to videos, documentation, and support information. |
-| *Reference architecture diagram* (Required for Azure IP co-sell incentive status) | A diagram that represents your offer and its relationship with Microsoft cloud services. It may also demonstrate how your offer meets the technical requirements for Azure IP Co-sell incentive status. [Learn more about the reference architecture diagram.](reference-architecture-diagram.md) |
-| *Other documents* (Optional) | You may upload up to five additional documents or videos to help Microsoft sales teams and channel resellers learn more about your offer, organization, and how it's different from other offers. |
-|||
-
-- After you create your documents, drag them to the appropriate box under **Documents** or select **browse your file(s)** to upload a document from your computer.
-
- [![Illustrates the co-sell documentation section of the Co-sell with Microsoft tab.](./media/co-sell/co-sell-documents-section.png)](./media/co-sell/co-sell-documents-section.png#lightbox)
-
-## Product landing page
-
-- Under **Documents**, in the **Product landing page** box, enter the link to your product's site, where Microsoft sales teams and channel resellers can learn more about your offer and view the latest updates.
-
-## Enter your contacts
-
-A contact for each geography in which your offer is available is required to achieve co-sell ready status. If you choose to make your offer available in the CSP program, this contact information is also available to channel resellers.
-
-Your contact information lets Microsoft sales teams and channel resellers request additional information from the appropriate resource in your organization. Contact information is available to all Microsoft sales teams.
-
-> [!NOTE]
-> It is critical that you keep your contact information up to date.
-
-1. To download the template to provide your contact information, under **Contacts**, select **Download contacts template (.csv)** as seen in this screenshot. If you previously uploaded contacts, you can export your existing list of contacts for an offer, and then make changes in that .CSV file.
-
- [![Illustrates the Contacts section of the Co-sell with Microsoft tab.](./media/co-sell/co-sell-contacts-section.png)](./media/co-sell/co-sell-contacts-section.png#lightbox)
-
-1. Open the .CSV file in an application such as Microsoft Excel, and then fill in each row with information about the contact.
-
- - Name (Required): The contact's name.
- - Email (Required): The contact's email address.
- - Job title (Required): Job title.
- - Role (Required): Use any of the following roles.
-
- ***Table 3: Description of roles***
-
- | **Role** | **Description** |
- | :- | :-|
    | Partner marketing | This role focuses on marketing your offer and collaborating on marketing efforts with Microsoft sales teams and channel resellers. This contact is the main point of contact for marketing engagements and offer listing content, such as product descriptions, images, and videos. |
- | Partner sales | This role focuses on selling your offer and collaborating on sales with Microsoft sales teams and channel resellers. Indicate at least one partner sales contact for each region in which you want your offer to be Co-sell ready. The same partner sales contact can cover multiple regions. |
    | Partner technical sales | Supports technical architecture and deployment considerations during the sales cycle and the post-sales integration and deployment periods. |
- | Partner customer success manager | Typically supports customers post-deployment to help them get the most out of your offer and increase its usage within the customer's organization. |
- |||
-
    - Countries/Regions (Required): When filling out the template, use the two-letter [Co-sell country and region codes](commercial-marketplace-co-sell-countries.md). If the contact covers all countries and regions, use the three-letter code "OOO". If a contact covers more than one country or region, enter each of the two-letter codes separated by a comma. For example, enter "US, CA, FR" without quotation marks into the template. (A scripted sketch of a complete contacts file follows these steps.)
-
- The countries and regions should reflect each contact's territory. Microsoft sales teams and channel resellers will use this information when requesting information or collaborating on sales within the specific country or region.
-
- - States/Provinces (Optional): When filling out the template, use the XX-XX format as listed in the [states, provinces, and territories tables](commercial-marketplace-co-sell-states.md).
-
-1. Save and close the .CSV file.
-
-1. To import the .CSV file, select the **Import contacts (.csv)** link.
- > [!NOTE]
- > Importing the .CSV file will overwrite any existing contacts.
-
-1. Select the .CSV file and then select **Open**. A message appears stating that the contacts have been successfully imported.
-
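For illustration, here is a minimal Python sketch that writes a contacts file with the columns described in the steps above. The contact details are invented, and the header row should be matched to whatever the template downloaded from Partner Center actually contains:

```python
# Hypothetical sketch: build a co-sell contacts .csv with Python's
# standard csv module. All contact details below are invented; align
# the header row with the template downloaded from Partner Center.
import csv

header = ["Name", "Email", "Job title", "Role",
          "Countries/Regions", "States/Provinces"]
contacts = [
    ["Jane Doe", "jane@contoso.com", "Sales Lead", "Partner sales", "US, CA", "US-WA"],
    ["Sam Rao", "sam@contoso.com", "CMO", "Partner marketing", "OOO", ""],
]

with open("co-sell-contacts.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(header)
    writer.writerows(contacts)
```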
-## Save and republish the offer
-
-1. Select **Save draft** to save your changes before you continue.
-1. After you've completed all the required sections of the offer, you can submit it for review and publication. Select **Review and publish**.
-1. Do one of the following:
-
    - If you completed or updated the **Co-sell with Microsoft** tab for an offer that has been previously published live and you haven't updated any other tabs, we recommend that you only select the **Co-sell** check box.
-
- - If this is a new or draft offer that has never been published, we recommend that you select all boxes. You can optionally select **Compare** to compare the current version against the unpublished changes.
-
-1. To begin the validation phase, select **Publish**. Your offer won't be published live until it has been reviewed and you select **Go live** after the validation phase is complete. If your offer was already published and you configured co-sell, the offer remains live while we validate the co-sell status. For details about reviewing and publishing an offer, see [How to review and publish an offer to the commercial marketplace](review-publish-offer.md).
-
-> [!NOTE]
-> You no longer need to contact us to nominate your offer for co-sell. After you complete all required fields on the Co-sell with Microsoft page and republish your offer, we will review your offer to determine if it meets the requirements for co-sell status.
-
-## Next steps
-
-- For details about republishing an offer, see [How to review and publish an offer to the commercial marketplace](review-publish-offer.md).
-- For information about commercial marketplace rewards and technical benefits, see [Your commercial marketplace benefits](gtm-your-marketplace-benefits.md).
marketplace Co Sell Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/co-sell-overview.md
- Title: Co-sell with Microsoft sales teams and partners overview
-description: The Microsoft Partner Center Co-sell program for partners can help you reach a vast customer base and generate new sales.
- Previously updated : 12/03/2021
-# Co-sell with Microsoft sales teams and partners overview
-
-Co-selling with Microsoft is defined as any collaborative engagement between Microsoft and our partner ecosystem. This process includes building demand, sales planning, sharing sales leads, accelerating partner-to-partner empowered selling, and delivering marketplace-led commerce.
-
-When you choose to co-sell an offer, you can work directly with Microsoft sales teams and Microsoft partners on joint selling opportunities. This unlocks benefits when selling through the commercial marketplace online stores: Azure Marketplace and Microsoft AppSource.
-
-Co-selling opportunities are the result of [acting on a lead](./partner-center-portal/commercial-marketplace-get-customer-leads.md) and collaborating on it with Microsoft sales teams or Microsoft partners to provide a solution for a customer need.
-
-[![Diagram showing how Co-sell happens when sales leads are shared, accepted, and won against Microsoft-managed customers.](./media/marketplace-publishers-guide/marketplace-co-sell-v2.png)](./media/marketplace-publishers-guide/marketplace-co-sell-v2.png#lightbox)
-
-These solutions (also called offers) can include software built with your intellectual property (IP) and services that support Microsoft technologies.
-
-## Co-sell opportunities
-
-A co-sell opportunity is any type of collaboration with Microsoft sales teams, Microsoft partners, or both to sell products and solutions that meet customer needs. Some of the key categories of co-sell are:
-
-- **Co-sell with Microsoft sales teams** – Work with one or more Microsoft sales teams to actively fulfill customer needs. This can include selling your offers, Microsoft's offers, or both. You and Microsoft sales teams can identify and share customer opportunities in which your solutions may be a good fit.
-- **Partner to Partner (P2P)** – Work with another Microsoft partner to actively solve a customer problem.
-- **Private deal** – Share what you are independently working on with Microsoft so it will be reflected in the Microsoft reporting system for analysis and forecasting.
-- **Solution Assessment (SA)** – Work with partners who are vetted by the solution assessments business team to assess the technology needs of customers who are using or planning to use Microsoft technologies.
-
-## Co-sell statuses
-
-These are the levels of co-sell status that can be applied to an offer.
-
-Co-sell statuses for Azure:
-
-- Not co-sell ready
-- Co-sell ready
-- Azure IP co-sell incentive
-
-Co-sell statuses for business applications:
-- Business Applications Co-sell Incentive Standard
-- Business Applications Co-sell Incentive Premium
-
-For details about the requirements to achieve these co-sell statuses, see [Co-sell requirements](co-sell-requirements.md).
-
-## Benefits of co-sell ready status
-
-Co-sell ready status exposes your solutions to Microsoft sales teams. Co-selling with Microsoft sales teams and Microsoft partners helps you reach a vast community of Microsoft-managed customers to collaborate on sales opportunities that accelerate your business growth.
-
-To learn how to achieve co-sell ready and Azure IP co-sell status, see [Co-sell ready and co-sell incentive requirements](co-sell-requirements.md).
-
-## Benefits of co-sell incentive status
-
-_Co-sell incentivized_ status includes _Azure IP Co-sell incentivized_ and _Business Applications Co-sell incentivized (Standard and Premium)_. These statuses incentivize Microsoft sales teams to sell your offer. To achieve this status, you must also achieve Co-sell ready status. Co-sell incentive status carries all the benefits of Co-sell ready status, can earn additional incentives for Microsoft sales teams, and makes your offer eligible for more commercial marketplace benefits.
-
-Azure IP Co-sell incentive status can be applied to these offer types:
-
-- Azure Application
-- Azure Container
-- Azure Virtual Machine
-- IoT Edge module
-- Software as a service (SaaS)
-
-Business Applications Co-sell incentive (Standard and Premium) status can be applied to these offer types:
-
-- Dynamics 365 apps on Dataverse and Power Apps
-- Dynamics 365 Operations Apps
-
-Offers that achieve _Azure IP Co-sell incentivized_ status gain these commercial marketplace benefits:
-
-- Sales of your offer through Azure Marketplace will contribute towards customers' Azure consumption commitments. Eligible customers will see the offer marked as **Azure benefit eligible** in the Azure portal. To learn more about how the MACC program benefits customers and how they can find solutions that are enabled for MACC, see [Azure Consumption Commitment benefit](/marketplace/azure-consumption-commitment-benefit). For information about how publishers can get their transactable offer enrolled in MACC, see [Azure consumption commitment enrollment](azure-consumption-commitment-enrollment.md).
-- Offers that achieve _Azure IP Co-sell incentivized_ status or are enrolled in the [Microsoft Business Applications ISV Connect Program](business-applications-isv-program.md) with Co-sell ready status will receive a **Microsoft Preferred solutions** badge on the offer listing page in the online stores: [Azure Marketplace](https://azuremarketplace.microsoft.com/) and [AppSource](https://appsource.microsoft.com/). After an offer achieves the corresponding status, it may take up to 30 days for the preferred solution badge to be displayed in the online store. The badge promotes an offer's quality, performance, and ability to address customer needs in a certain industry vertical or solution area.
-
-To learn how to achieve co-sell ready and co-sell incentive status, see [Co-sell ready and Co-sell incentive requirements](co-sell-requirements.md).
-
-## Next steps
-
-- For information about requirements, see [Co-sell ready and co-sell incentive requirements](co-sell-requirements.md)
-- To configure an offer for co-sell, see [Configure Co-sell for a commercial marketplace offer](co-sell-configure.md)
-- To verify co-sell status, see [Verify Co-sell status for an offer](co-sell-status.md).
-- Learn more about [Co-selling with Microsoft](https://partner.microsoft.com/membership/sell-with-microsoft).
marketplace Co Sell Requirements https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/co-sell-requirements.md
- Title: Co-sell requirements | Azure Marketplace
-description: Learn about the requirements an offer in the Microsoft commercial marketplace must meet to qualify for co-sell ready or co-sell incentive status.
- Previously updated : 12/03/2021
-# Co-sell requirements
-
-This article provides the requirements for the various levels of co-sell status. For the latest list of offer types that support co-sell, see [Configure Co-sell for a commercial marketplace offer](co-sell-configure.md). For an overview of co-sell, see [Co-sell with Microsoft sales teams and partners overview](co-sell-overview.md).
-
-This table shows all possible co-sell statuses:
-
-| Status | Comment |
-| | - |
-| Not co-sell ready | The minimum [requirements for Co-sell ready status](#requirements-for-co-sell-ready-status) have not been met. |
-| Co-sell ready | All [requirements for Co-sell ready status](#requirements-for-co-sell-ready-status) have been met. |
-| Azure IP Co-sell incentive | Co-sell ready requirements have been met in addition to [these additional requirements](#requirements-for-azure-ip-co-sell-incentive-status). |
-| Business Applications Co-sell incentive | This status applies to _Dynamics 365 apps on Dataverse and Power Apps_ offers in the [Microsoft Business Applications ISV Connect Program](business-applications-isv-program.md) and indicates that all [requirements for this status](#requirements-for-business-applications-co-sell-incentive-status) have been met. |
-|||
-
-## Requirements for Co-sell ready status
-
-> [!NOTE]
-> Any offer published through the commercial marketplace developer program in Partner Center is eligible to attain Co-sell Ready status, provided the Co-sell Ready requirements are met. Office apps and add-ins (for example, Teams, SharePoint solutions, Outlook, and Excel) are not eligible.
-
-For an offer to achieve co-sell ready status, you must meet the following requirements:
-
-**All partners**:
-
-- Have an MPN ID and an active [commercial marketplace account in Partner Center](create-account.md).
-- Make sure you have a complete [business profile](/partner-center/create-a-marketing-profile) in Partner Center. As a qualified Microsoft partner, your business profile helps to showcase your business to customers who are looking for your unique solutions and expertise to address their business needs, resulting in [referrals](/partner-center/referrals).
-- Complete the **Co-sell with Microsoft** tab and publish the offer to the commercial marketplace.
-- Provide a sales contact for each co-sell eligible geography and the required bill of materials.
-
-**Services partners**:
-
-- For offers of the _Service solution_ type, you must have an active gold competency in any competency area.
-
-**Business Applications ISVs**:
-
-- Dynamics 365 apps on Dataverse and Power Apps and Dynamics 365 Operations Apps solutions require ISV Connect enrollment.
-
-### Complete the Co-sell with Microsoft tab
-
-When publishing or updating your offer, provide all the required information on the **Co-sell with Microsoft** tab as detailed in [Configure Co-sell for a commercial marketplace offer](./co-sell-configure.md). This includes providing the following documents:
-
-- Solution/offer one-pager
-- Solution/offer pitch deck
-
-We provide templates to help you create these documents. For more information about the required and optional information for the Co-sell with Microsoft tab, see [Configure Co-sell for a commercial marketplace offer](./co-sell-configure.md).
-
-### Publish your offer live
-
-To qualify for co-sell ready status, your offer or solution must be published live to at least one of the commercial marketplace online stores: Azure Marketplace or Microsoft AppSource. For information about publishing offers to the commercial marketplace, see [Publishing guide by offer type](publisher-guide-by-offer-type.md). If you haven't published an offer in the commercial marketplace before, make sure you have a [commercial marketplace account](create-account.md).
-
-## Requirements for Azure IP Co-sell incentive status
-
-Azure IP Co-sell incentive status applies to the following offer types:
-
-- Azure Application
-- Azure Container
-- Azure Virtual Machine
-- IoT Edge module
-- Software as a service (SaaS)
-
-After achieving Co-sell ready status, there are three additional requirements to achieve Azure IP Co-sell incentive status:
-
-Requirement 1 – Achieve the following:
-
-- At the _organization level_, meet an Azure Consumed Revenue threshold of at least $100,000 USD over the trailing 12-month period. This can be achieved through a combination of Azure solutions. If the offer is transactable in the commercial marketplace, you can instead meet this requirement with a billed revenue threshold of $100,000 USD over the trailing 12-month period.
-
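As a back-of-the-envelope illustration with invented monthly figures, the trailing 12-month check amounts to a simple sum:

```python
# Invented monthly Azure Consumed Revenue (ACR) figures in USD; only the
# $100,000 threshold comes from the requirement above.
monthly_acr_usd = [9500, 8700, 10200, 8900, 9100, 8600,
                   9800, 8400, 9300, 8800, 9600, 9200]

trailing_12_month_acr = sum(monthly_acr_usd)  # 110,100 in this example
print(f"Trailing 12-month ACR: ${trailing_12_month_acr:,}")
print("Meets $100,000 USD threshold:", trailing_12_month_acr >= 100_000)
```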
-Requirement 2 – Pass the Microsoft technical validation for an Azure-based solution:
-- The technical validation must confirm that more than 50% of your offer's infrastructure uses repeatable IP code on Azure. Note that transactable Azure VMs and Azure Application solutions on the commercial marketplace will meet this requirement by default.
-
-Requirement 3 – Provide a reference architecture diagram:
-- Upload a reference architecture diagram with your Co-sell documents in Partner Center for review. For guidance on creating this diagram, see [Reference architecture diagram](reference-architecture-diagram.md). For information about uploading the diagram, see [Configure Co-sell for a commercial marketplace offer](./co-sell-configure.md).
-
-## Requirements for Business Applications Co-sell incentive status
-
-This status applies to IP-based solutions built on Dynamics 365 apps on Dataverse and Power Apps and Dynamics 365 Operations Apps that are enrolled in the ISV Connect program. However, offers must also meet the requirements for Co-sell ready status (described above) so that Microsoft sellers can co-sell the offer with you.
-
-## Next steps
-
-- [Configure Co-sell for a commercial marketplace offer](./co-sell-configure.md)
marketplace Co Sell Solution Migration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/co-sell-solution-migration.md
Ensure you have an active Microsoft Partner Network membership and are enrolled
## Publishing updates for attaining co-sell-ready status
-For your solution to be discoverable to Microsoft sellers and partners, it must meet the [co-sell ready requirements](./co-sell-overview.md). For a Microsoft seller to earn a Co-sell incentive, your solution must meet the [incentive-eligible requirements](./co-sell-overview.md). Complete these requirements on the co-sell tab in Partner Center (see [this image](#action-2-merge) later in this article).
+For your solution to be discoverable to Microsoft sellers and partners, it must meet the [co-sell ready requirements](/partner-center/co-sell-overview?context=/azure/marketplace/context/context). For a Microsoft seller to earn a Co-sell incentive, your solution must meet the [incentive-eligible requirements](/partner-center/co-sell-overview?context=/azure/marketplace/context/context). Complete these requirements on the co-sell tab in Partner Center (see [this image](#action-2-merge) later in this article).
> [!NOTE]
> In the commercial marketplace, your solutions are referred to as "offers" throughout the publishing experience.
If you do not have an offer already in the commercial marketplace to merge a sol
> [!TIP]
> We recommend that you *do not fill out* the data in the **Co-sell with Microsoft** tab. To save you time, we will populate this data for you from your existing collateral in OCP GTM during the merge process.
- After the merge is complete you can return to the Co-sell with Microsoft tab and make updates if needed. For more information, see [Configure co-sell for a commercial marketplace offer](./co-sell-configure.md).
+ After the merge is complete you can return to the Co-sell with Microsoft tab and make updates if needed. For more information, see [Configure co-sell for a commercial marketplace offer](/partner-center/co-sell-configure?context=/azure/marketplace/context/context).
1. When complete, select **Review and publish**.

   [ ![Co-Sell with Microsoft page is displayed with options highlighted.](media/co-sell-migrate/co-sell-with-ms-workspaces.png) ](media/co-sell-migrate/co-sell-with-ms-workspaces.png#lightbox)
Select this option when a solution in OCP GTM solutions is no longer relevant. Y
## Next steps

- [Resell through CSP Partners](cloud-solution-providers.md)
-- [Configure co-sell for a commercial marketplace offer](./co-sell-configure.md)
+- [Configure co-sell for a commercial marketplace offer](/partner-center/co-sell-configure?context=/azure/marketplace/context/context)
- View these [FAQs](https://partner.microsoft.com/resources/detail/co-sell-requirements-publish-commercial-marketplace-faq-pdf) (PDF)
marketplace Co Sell Status https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/co-sell-status.md
- Title: Verify co-sell status of a commercial marketplace offer | Azure Marketplace
-description: Learn how to verify the co-sell status of an offer in the Microsoft commercial marketplace.
- Previously updated : 09/27/2021
-# Verify Co-sell status of a commercial marketplace offer
-
-You can verify the Co-sell status of an offer on its **Offer overview** page in the Commercial Marketplace program of Partner Center.
-
-## Verify Co-sell status
-
-1. Sign in to [Partner Center](https://partner.microsoft.com/dashboard/home).
-1. On the Home page, select the **Marketplace offers** tile.
-
- [ ![Illustrates the Marketplace offers tile on the Partner Center Home page.](./media/workspaces/partner-center-home.png) ](./media/workspaces/partner-center-home.png#lightbox)
-
-1. In the **Offer alias** column, select the offer you want. The co-sell status is shown in the Marketplace Programs section of the page.
-
- [![Illustrates the co-sell status in the Marketplace Programs of the Overview page in Partner Center.](./media/co-sell/co-sell-status.png)](./media/co-sell/co-sell-status.png#lightbox)
-
-The following table shows all possible co-sell statuses. To learn about the requirements for each co-sell status, see [Co-sell requirements](co-sell-requirements.md).
-
-| Status | Comment |
-| | - |
-| Not co-sell ready | The minimum [requirements for Co-sell ready status](co-sell-requirements.md#requirements-for-co-sell-ready-status) have not been met. |
-| Co-sell ready | All [requirements for Co-sell ready status](co-sell-requirements.md#requirements-for-co-sell-ready-status) have been met. |
-| Azure IP Co-sell incentive | Co-sell ready requirements have been met in addition to [these additional requirements](co-sell-requirements.md#requirements-for-azure-ip-co-sell-incentive-status). |
-| Business Applications Co-sell incentive | This status applies to Dynamics 365 and Power Apps offers in the [ISV Connect Program](business-applications-isv-program.md) and indicates that all [requirements for this status](co-sell-requirements.md#requirements-for-business-applications-co-sell-incentive-status) have been met. |
-|||
-
-## Next steps
-
-- For information about Co-sell requirements, see [Co-sell ready requirements](co-sell-requirements.md).
-- For help configuring an offer for Co-sell, see [Configure Co-sell for a commercial marketplace offer](co-sell-configure.md).
marketplace Commercial Marketplace Co Sell Countries https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/commercial-marketplace-co-sell-countries.md
- Title: Co-sell country and region codes | Azure Marketplace
-description: Use these two-letter country/region codes when providing contact info on your offer's Co-sell page.
- Previously updated : 04/27/2021
-# Co-sell country and region codes
-
-Use these two-letter country/region codes when [providing contact info on your offer's Co-sell page](./co-sell-configure.md).
-
-If the contact covers all Countries/Regions, use the three-letter code "OOO".
-
-If a contact covers more than one Country/Region, enter each of the two-letter codes separated by a comma (for example, enter "US, CA, FR" without quotation marks).
-
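As a minimal sketch, an entry in this format could be validated as follows; only a handful of the ISO-2 codes from the table below are included, and the rule that "OOO" stands alone is an assumption made for illustration:

```python
# Hedged sketch: validate a comma-separated Countries/Regions entry.
# VALID_CODES holds just a few ISO-2 values from the table below; the
# assumption that "OOO" must appear alone is illustrative.
VALID_CODES = {"US", "CA", "FR", "DE", "GB", "JP"}

def is_valid_entry(entry):
    codes = [code.strip().upper() for code in entry.split(",")]
    if codes == ["OOO"]:  # global contact
        return True
    return all(code in VALID_CODES for code in codes)

print(is_valid_entry("US, CA, FR"))  # True
print(is_valid_entry("OOO"))         # True
print(is_valid_entry("US, XX"))      # False: XX is not in the table
```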
-## Country/Region table
-
-| Country/Region Name | ISO-2 |
-|-|--|
-| Global contacts | OOO |
-| Afghanistan | AF |
-| Åland Islands | AX |
-| Albania | AL |
-| Algeria | DZ |
-| American Samoa | AS |
-| Andorra | AD |
-| Angola | AO |
-| Antarctica | AQ |
-| Antigua and Barbuda | AG |
-| Argentina | AR |
-| Armenia | AM |
-| Aruba | AW |
-| Australia | AU |
-| Austria | AT |
-| Azerbaijan | AZ |
-| Bahamas | BS |
-| Bahrain | BH |
-| Bangladesh | BD |
-| Barbados | BB |
-| Belarus | BY |
-| Belgium | BE |
-| Belize | BZ |
-| Benin | BJ |
-| Bermuda | BM |
-| Bhutan | BT |
-| Bolivia | BO |
-| Bonaire | BQ |
-| Bosnia and Herzegovina | BA |
-| Botswana | BW |
-| Bouvet Island | BV |
-| Brazil | BR |
-| British Indian Ocean Territory | IO |
-| British Virgin Islands | VG |
-| Brunei | BN |
-| Bulgaria | BG |
-| Burkina Faso | BF |
-| Burundi | BI |
-| Cabo Verde | CV |
-| Cambodia | KH |
-| Cameroon | CM |
-| Canada | CA |
-| Cayman Islands | KY |
-| Central African Republic | CF |
-| Chad | TD |
-| Czechia | CZ |
-| Chile | CL |
-| China | CN |
-| Christmas Island | CX |
-| Cocos (Keeling) Islands | CC |
-| Colombia | CO |
-| Comoros | KM |
-| Congo | CG |
-| Congo (DRC) | CD |
-| Cook Islands | CK |
-| Costa Rica | CR |
-| Côte d'Ivoire | CI |
-| Croatia | HR |
-| Cuba | CU |
-| Curaçao | CW |
-| Cyprus | CY |
-| Denmark | DK |
-| Djibouti | DJ |
-| Dominica | DM |
-| Dominican Republic | DO |
-| Ecuador | EC |
-| Egypt | EG |
-| El Salvador | SV |
-| Equatorial Guinea | GQ |
-| Eritrea | ER |
-| Estonia | EE |
-| eSwatini | SZ |
-| Ethiopia | ET |
-| Faroe Islands | FO |
-| Fiji | FJ |
-| Finland | FI |
-| France | FR |
-| French Guiana | GF |
-| French Polynesia | PF |
-| French Southern Territories | TF |
-| Gabon | GA |
-| Gambia | GM |
-| Georgia | GE |
-| Germany | DE |
-| Ghana | GH |
-| Gibraltar | GI |
-| Greece | GR |
-| Greenland | GL |
-| Grenada | GD |
-| Guadeloupe | GP |
-| Guam | GU |
-| Guatemala | GT |
-| Guernsey | GG |
-| Guinea | GN |
-| Guinea-Bissau | GW |
-| Guyana | GY |
-| Haiti | HT |
-| Heard Island and McDonald Islands | HM |
-| Honduras | HN |
-| Hong Kong SAR | HK |
-| Hungary | HU |
-| Iceland | IS |
-| India | IN |
-| Indonesia | ID |
-| Iran | IR |
-| Iraq | IQ |
-| Ireland | IE |
-| Isle of Man | IM |
-| Israel | IL |
-| Italy | IT |
-| Jamaica | JM |
-| Japan | JP |
-| Jersey | JE |
-| Jordan | JO |
-| Kazakhstan | KZ |
-| Kenya | KE |
-| Kiribati | KI |
-| Korea (South) | KR |
-| Kuwait | KW |
-| Kyrgyzstan | KG |
-| Laos | LA |
-| Latvia | LV |
-| Lebanon | LB |
-| Lesotho | LS |
-| Liberia | LR |
-| Libya | LY |
-| Liechtenstein | LI |
-| Lithuania | LT |
-| Luxembourg | LU |
-| Macao SAR | MO |
-| Madagascar | MG |
-| Malawi | MW |
-| Malaysia | MY |
-| Maldives | MV |
-| Mali | ML |
-| Malta | MT |
-| Marshall Islands | MH |
-| Martinique | MQ |
-| Mauritania | MR |
-| Mauritius | MU |
-| Mayotte | YT |
-| Mexico | MX |
-| Micronesia | FM |
-| Moldova | MD |
-| Monaco | MC |
-| Mongolia | MN |
-| Montenegro | ME |
-| Montserrat | MS |
-| Morocco | MA |
-| Mozambique | MZ |
-| Myanmar | MM |
-| Namibia | NA |
-| Nauru | NR |
-| Nepal | NP |
-| Netherlands | NL |
-| New Caledonia | NC |
-| New Zealand | NZ |
-| Nicaragua | NI |
-| Niger | NE |
-| Nigeria | NG |
-| Niue | NU |
-| Norfolk Island | NF |
-| North Korea | KP |
-| Northern Mariana Islands | MP |
-| North Macedonia | MK |
-| Norway | NO |
-| Oman | OM |
-| Pakistan | PK |
-| Palau | PW |
-| Palestinian Authority | PS |
-| Panama | PA |
-| Papua New Guinea | PG |
-| Paraguay | PY |
-| Peru | PE |
-| Philippines | PH |
-| Pitcairn Islands | PN |
-| Poland | PL |
-| Portugal | PT |
-| Puerto Rico | PR |
-| Qatar | QA |
-| Réunion | RE |
-| Romania | RO |
-| Russia | RU |
-| Rwanda | RW |
-| Saint Barthélemy | BL |
-| Saint Kitts and Nevis | KN |
-| Saint Lucia | LC |
-| Saint Martin | MF |
-| Saint Pierre and Miquelon | PM |
-| Saint Vincent and the Grenadines | VC |
-| Samoa | WS |
-| San Marino | SM |
-| São Tomé and Príncipe | ST |
-| Saudi Arabia | SA |
-| Senegal | SN |
-| Serbia | RS |
-| Seychelles | SC |
-| Sierra Leone | SL |
-| Singapore | SG |
-| Sint Maarten | SX |
-| Slovakia | SK |
-| Slovenia | SI |
-| Solomon Islands | SB |
-| Somalia | SO |
-| South Africa | ZA |
-| South Georgia and South Sandwich Islands | GS |
-| South Sudan | SS |
-| Spain | ES |
-| Sri Lanka | LK |
-| St Helena, Ascension, Tristan da Cunha | SH |
-| Sudan | SD |
-| Suriname | SR |
-| Svalbard | SJ |
-| Sweden | SE |
-| Switzerland | CH |
-| Syria | SY |
-| Taiwan | TW |
-| Tajikistan | TJ |
-| Tanzania | TZ |
-| Thailand | TH |
-| Timor-Leste | TL |
-| Togo | TG |
-| Tokelau | TK |
-| Tonga | TO |
-| Trinidad and Tobago | TT |
-| Tunisia | TN |
-| Turkey | TR |
-| Turkmenistan | TM |
-| Turks and Caicos Islands | TC |
-| Tuvalu | TV |
-| Uganda | UG |
-| Ukraine | UA |
-| United Arab Emirates | AE |
-| United Kingdom | GB |
-| United States | US |
-| Uruguay | UY |
-| U.S. Outlying Islands | UM |
-| U.S. Virgin Islands | VI |
-| Uzbekistan | UZ |
-| Vanuatu | VU |
-| Vatican City | VA |
-| Venezuela | VE |
-| Vietnam | VN |
-| Wallis and Futuna | WF |
-| Yemen | YE |
-| Zambia | ZM |
-| Zimbabwe | ZW |
-
-## Next steps
-
-- Learn about the [co-sell option in the commercial marketplace](./co-sell-configure.md).
marketplace Commercial Marketplace Co Sell States https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/commercial-marketplace-co-sell-states.md
- Title: Co-sell state and province codes in Azure Marketplace
-description: Get the available state and province codes when providing contact info on your offer's Co-sell page in Azure Marketplace.
- Previously updated : 04/27/2021
-# Co-sell state and province codes in Azure Marketplace
-
-This article lists the available state and province codes. If applicable, use these codes to provide State/Province info when [providing contact info on your offer's Co-sell page](./co-sell-configure.md#enter-your-contacts).
-
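For illustration, a simple pattern check against the XX-XX format used in the tables below (Australian codes have a three-letter suffix, such as AU-NSW) might look like this:

```python
# Illustrative pattern check for the XX-XX state/province format below.
# Australian codes use a three-letter suffix (for example, AU-NSW).
import re

STATE_CODE = re.compile(r"^[A-Z]{2}-[A-Z]{2,3}$")

for value in ("US-WA", "CA-ON", "AU-NSW", "Washington"):
    print(value, bool(STATE_CODE.match(value)))
```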
-## US states and territories
-
-| State and territory name | Code |
-|-|--|
-| Alabama | US-AL |
-| Alaska | US-AK |
-| Arizona | US-AZ |
-| Arkansas | US-AR |
-| California | US-CA |
-| Colorado | US-CO |
-| Connecticut | US-CT |
-| Delaware | US-DE |
-| Florida | US-FL |
-| Georgia | US-GA |
-| Hawaii | US-HI |
-| Idaho | US-ID |
-| Illinois | US-IL |
-| Indiana | US-IN |
-| Iowa | US-IA |
-| Kansas | US-KS |
-| Kentucky | US-KY |
-| Louisiana | US-LA |
-| Maine | US-ME |
-| Maryland | US-MD |
-| Massachusetts | US-MA |
-| Michigan | US-MI |
-| Minnesota | US-MN |
-| Mississippi | US-MS |
-| Missouri | US-MO |
-| Montana | US-MT |
-| Nebraska | US-NE |
-| Nevada | US-NV |
-| New Hampshire | US-NH |
-| New Jersey | US-NJ |
-| New Mexico | US-NM |
-| New York | US-NY |
-| North Carolina | US-NC |
-| North Dakota | US-ND |
-| Ohio | US-OH |
-| Oklahoma | US-OK |
-| Oregon | US-OR |
-| Pennsylvania | US-PA |
-| Rhode Island | US-RI |
-| South Carolina | US-SC |
-| South Dakota | US-SD |
-| Tennessee | US-TN |
-| Texas | US-TX |
-| Utah | US-UT |
-| Vermont | US-VT |
-| Virginia | US-VA |
-| Washington | US-WA |
-| West Virginia | US-WV |
-| Wisconsin | US-WI |
-| Wyoming | US-WY |
-| District of Columbia | US-DC |
-| American Samoa | US-AS |
-| Guam | US-GU |
-| Northern Mariana Islands | US-MP |
-| Puerto Rico | US-PR |
-| United States Minor Outlying Islands | US-UM |
-| U.S. Virgin Islands | US-VI |
-
-## Canadian provinces and territories
-
-| Province and territory name | Code |
-|-|--|
-| Alberta | CA-AB |
-| British Columbia | CA-BC |
-| Manitoba | CA-MB |
-| New Brunswick | CA-NB |
-| Newfoundland and Labrador | CA-NL |
-| Nova Scotia | CA-NS |
-| Ontario | CA-ON |
-| Prince Edward Island | CA-PE |
-| Quebec | CA-QC |
-| Saskatchewan | CA-SK |
-| Northwest Territories | CA-NT |
-| Nunavut | CA-NU |
-| Yukon | CA-YT |
--
-## Australian states and territories
-
-| State and territory name | Code |
-|-|--|
-| New South Wales | AU-NSW |
-| Queensland | AU-QLD |
-| South Australia | AU-SA |
-| Tasmania | AU-TAS |
-| Victoria | AU-VIC |
-| Western Australia | AU-WA |
-| Australian Capital Territory | AU-ACT |
-| Northern Territory | AU-NT |
--
-## Next steps
-
-- Learn about the [co-sell option in the commercial marketplace](./co-sell-configure.md).
marketplace Create Managed Service Offer Plans https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/create-managed-service-offer-plans.md
After your offer is published, you can [publish an updated version of your offer
## Next steps

-- Exit plan setup and continue with optional [Co-sell with Microsoft](./co-sell-overview.md), or
+- Exit plan setup and continue with optional [Co-sell with Microsoft](/partner-center/co-sell-overview?context=/azure/marketplace/context/context), or
- [Review and publish your offer](review-publish-offer.md)
marketplace Create New Saas Offer Marketing https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/create-new-saas-offer-marketing.md
This article describes additional options you can choose if you're selling you
Providing information on the **Co-sell with Microsoft** page is entirely optional, but it's required to achieve _Co-sell Ready_ and _IP Co-sell Ready_ status. Microsoft sales teams use this information to learn more about your solution when evaluating its fit for customer needs. The information you provide on this tab isn't available directly to customers.
-For details and instructions to configure the **Co-sell with Microsoft** tab, see [Co-sell option in the commercial marketplace](./co-sell-configure.md).
+For details and instructions to configure the **Co-sell with Microsoft** tab, see [Co-sell option in the commercial marketplace](/partner-center/co-sell-configure?context=/azure/marketplace/context/context).
## Resell through CSPs
marketplace Dynamics 365 Business Central Technical Configuration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/dynamics-365-business-central-technical-configuration.md
Required if your offer must be installed along with another extension that will
>[!NOTE]
>The dependency package file is no longer used. Upload a library extension package file instead.
-Select **Save draft** before continuing to the next tab in the left-nav menu, **Co-sell with Microsoft**. For information on setting up co-sell with Microsoft (optional), see [Co-sell partner engagement](./co-sell-overview.md). If you're not setting up co-sell or you've finished, continue with **Next steps** below.
+Select **Save draft** before continuing to the next tab in the left-nav menu, **Co-sell with Microsoft**. For information on setting up co-sell with Microsoft (optional), see [Co-sell partner engagement](/partner-center/co-sell-overview?context=/azure/marketplace/context/context). If you're not setting up co-sell or you've finished, continue with **Next steps** below.
## Next steps
marketplace Dynamics 365 Customer Engage Technical Configuration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/dynamics-365-customer-engage-technical-configuration.md
Select **+ Add region** to specify the geographic regions in which your CRM pack
By default, the **Application configuration URL** you entered above will be used for each region. Leave the Application Configuration URL field blank.
-Select **Save draft** before continuing to the next tab in the left-nav menu, **Co-sell with Microsoft**. For information on setting up co-sell with Microsoft (optional), see [Co-sell with Microsoft sales teams and partners overview](./co-sell-overview.md). If you're not setting up co-sell or you've finished, continue with **Next steps** below.
+Select **Save draft** before continuing to the next tab in the left-nav menu, **Co-sell with Microsoft**. For information on setting up co-sell with Microsoft (optional), see [Co-sell with Microsoft sales teams and partners overview](/partner-center/co-sell-overview?context=/azure/marketplace/context/context). If you're not setting up co-sell or you've finished, continue with **Next steps** below.
## Next steps

-- [Configure supplemental content](dynamics-365-customer-engage-supplemental-content.md)
+- [Configure supplemental content](dynamics-365-customer-engage-supplemental-content.md)
marketplace Dynamics 365 Operations Technical Configuration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/dynamics-365-operations-technical-configuration.md
Provide the solution identifier (GUID) for your solution. To find your solution
Select the version of Dynamics 365 Operations Apps this solution works with.
-Select **Save draft** before continuing to the next tab in the left-nav menu, **Co-sell with Microsoft**. For information on setting up co-sell with Microsoft (optional), see [Co-sell partner engagement](./co-sell-overview.md). If you're not setting up co-sell or you've finished, continue with **Next steps** below.
+Select **Save draft** before continuing to the next tab in the left-nav menu, **Co-sell with Microsoft**. For information on setting up co-sell with Microsoft (optional), see [Co-sell partner engagement](/partner-center/co-sell-overview?context=/azure/marketplace/context/context). If you're not setting up co-sell or you've finished, continue with **Next steps** below.
## Test drive technical configuration
marketplace Iot Edge Plan Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/iot-edge-plan-overview.md
Select **Create** and continue below.
## Next steps

- [+ Create new plan](iot-edge-plan-setup.md), or
-- Exit plan setup and continue with optional [Co-sell with Microsoft](./co-sell-overview.md), or
-- [Review and publish your offer](review-publish-offer.md)
+- Exit plan setup and continue with optional [Co-sell with Microsoft](/partner-center/co-sell-overview?context=/azure/marketplace/context/context), or
+- [Review and publish your offer](review-publish-offer.md)
marketplace Iot Edge Plan Technical Configuration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/iot-edge-plan-technical-configuration.md
Select **Save draft**, then **← Plan overview** in the left-nav menu to return
## Next steps

-- To **Co-sell with Microsoft** (optional), select it in the left-nav menu. For details, see [Co-sell partner engagement](./co-sell-overview.md).
+- To **Co-sell with Microsoft** (optional), select it in the left-nav menu. For details, see [Co-sell partner engagement](/partner-center/co-sell-overview?context=/azure/marketplace/context/context).
- If you're not setting up co-sell or you've finished, it's time to [Review and publish your offer](review-publish-offer.md).
marketplace Marketplace Containers https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/marketplace-containers.md
Containers support two licensing models: Free or Bring Your Own License (BYOL).
You can choose to opt into Microsoft-supported marketing and sales channels. When creating your offer in Partner Center, you will see two tabs toward the end of the process:

- **Resell through CSPs** – Allow Microsoft Cloud Solution Providers (CSP) partners to resell your solution as part of a bundled offer. For more information about this program, see [Cloud Solution Provider program](cloud-solution-providers.md).
-- **Co-sell with Microsoft** – Let Microsoft sales teams consider your IP co-sell eligible solution when evaluating their customers' needs. For details about co-sell eligibility, see [Requirements for co-sell status](/legal/marketplace/certification-policies). For details on preparing your offer for evaluation, see [Co-sell option in Partner Center](./co-sell-configure.md).
+- **Co-sell with Microsoft** – Let Microsoft sales teams consider your IP co-sell eligible solution when evaluating their customers' needs. For details about co-sell eligibility, see [Requirements for co-sell status](/legal/marketplace/certification-policies). For details on preparing your offer for evaluation, see [Co-sell option in Partner Center](/partner-center/co-sell-configure?context=/azure/marketplace/context/context).
## Container offer requirements
marketplace Marketplace Dynamics 365 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/marketplace-dynamics-365.md
To help create your offer more easily, prepare these items ahead of time. All ar
## Additional sales opportunities
-You can choose to opt into Microsoft-supported marketing and sales channels. When creating your offer in Partner Center, you will see a tab toward the end of the process for **Co-sell with Microsoft**. This option lets Microsoft sales teams consider your IP co-sell eligible solution when evaluating their customers' needs. See [Co-sell option in Partner Center](./co-sell-configure.md) for detailed information on how to prepare your offer for evaluation.
+You can choose to opt into Microsoft-supported marketing and sales channels. When creating your offer in Partner Center, you will see a tab toward the end of the process for **Co-sell with Microsoft**. This option lets Microsoft sales teams consider your IP co-sell eligible solution when evaluating their customers' needs. See [Co-sell option in Partner Center](/partner-center/co-sell-configure?context=/azure/marketplace/context/context) for detailed information on how to prepare your offer for evaluation.
## Next steps
marketplace Marketplace Power Bi https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/marketplace-power-bi.md
To help create your offer more easily, prepare these items ahead of time. All ar
You can choose to opt into Microsoft-supported marketing and sales channels. When creating your offer in Partner Center, you will see two tabs toward the end of the process:

-- **Co-sell with Microsoft** – Let Microsoft sales teams consider your IP co-sell eligible solution when evaluating their customers' needs. For details about co-sell eligibility, see [Requirements for co-sell status](/legal/marketplace/certification-policies). For details on preparing your offer for evaluation, see [Co-sell option in Partner Center](./co-sell-configure.md).
+- **Co-sell with Microsoft** – Let Microsoft sales teams consider your IP co-sell eligible solution when evaluating their customers' needs. For details about co-sell eligibility, see [Requirements for co-sell status](/legal/marketplace/certification-policies). For details on preparing your offer for evaluation, see [Co-sell option in Partner Center](/partner-center/co-sell-configure?context=/azure/marketplace/context/context).
## Next steps
marketplace Orders Dashboard https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/orders-dashboard.md
The Order details table displays a numbered list of the 1,000 top orders sorted
| MonthStartDate | Month Start Date | Month Start Date represents month of Purchase. The format is yyyy-mm-dd. | MonthStartDate |
| Offer Type | Offer Type | The type of commercial marketplace offering. | OfferType |
| Azure License Type | Azure License Type | The type of licensing agreement used by customers to purchase Azure. Also known as Channel. The possible values are:<ul><li>[Cloud Solution Provider](cloud-solution-providers.md)</li><li>Enterprise</li><li>Enterprise through Reseller</li><li>Pay as You Go</li><li>GTM</li></ul> | AzureLicenseType |
-| Marketplace License Type | Marketplace License Type | The billing method of the commercial marketplace offer. The different values are:<ul><li>[Cloud Solution Provider](cloud-solution-providers.md) (CSP)</li><li>Enterprise (EA)</li><li>Enterprise through reseller</li><li>Pay as You Go</li><li>[Go to market](co-sell-overview.md) (GTM)</li></ul> | MarketplaceLicenseType |
+| Marketplace License Type | Marketplace License Type | The billing method of the commercial marketplace offer. The different values are:<ul><li>[Cloud Solution Provider](cloud-solution-providers.md) (CSP)</li><li>Enterprise (EA)</li><li>Enterprise through reseller</li><li>Pay as You Go</li><li>[Go to market](/partner-center/co-sell-overview?context=/azure/marketplace/context/context) (GTM)</li></ul> | MarketplaceLicenseType |
| SKU | SKU | The plan associated with the offer | SKU |
| Customer Country | Customer Country/Region | The country/region name provided by the customer. Country/region could be different than the country/region in a customer's Azure subscription. | CustomerCountry |
| Is Preview SKU | Is Preview SKU | The value will let you know if you have tagged the SKU as "preview". Value will be "Yes" if the SKU has been tagged accordingly, and only Azure subscriptions authorized by you can deploy and use this image. Value will be "No" if the SKU has not been identified as "preview". | IsPreviewSKU |
marketplace Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/overview.md
Partners who list with the commercial marketplace are eligible for a diverse set
- Get the technical resources you need to get your application ready for launch, from technical support, application design, and architecture design, to Azure credits for development and testing.

- Access free Microsoft Go-To-Market Launch Fundamentals to help you launch and promote your solution. You might also be eligible for Microsoft marketing campaigns and opportunities to be featured in the commercial marketplace.
-- Reach more customers and expand your sales opportunities with the [Cloud Solution Provider](https://partner.microsoft.com/cloud-solution-provider) (CSP) program, the [co-sell](./co-sell-overview.md) program, and Microsoft Sales teams.
+- Reach more customers and expand your sales opportunities with the [Cloud Solution Provider](https://partner.microsoft.com/cloud-solution-provider) (CSP) program, the [co-sell](/partner-center/co-sell-overview?context=/azure/marketplace/context/context) program, and Microsoft Sales teams.
To learn about these benefits in more detail, see [Your commercial marketplace benefits](gtm-your-marketplace-benefits.md).
marketplace Plan Azure Application Offer https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/plan-azure-application-offer.md
This configuration is required if you want to use [Batch usage event](marketplac
You can choose to opt into Microsoft-supported marketing and sales channels. When creating your offer in Partner Center, you will see two tabs toward the end of the process:

- **Resell through CSPs**: Use this option to allow Microsoft Cloud Solution Providers (CSP) partners to resell your solution as part of a bundled offer. See [Cloud Solution Provider program](./cloud-solution-providers.md) for more information.
-- **Co-sell with Microsoft**: This option lets Microsoft sales teams consider your IP co-sell eligible solution when evaluating their customers' needs. For detailed information on how to prepare your offer for evaluation, see [Co-sell option in the commercial marketplace](./co-sell-configure.md). For details about IP co-sell requirements, see [Requirements for co-sell status](/legal/marketplace/certification-policies#3000-requirements-for-co-sell-status). For more information about marketing your offer through the Microsoft CSP partner channels, see [Cloud Solution Providers](cloud-solution-providers.md).
+- **Co-sell with Microsoft**: This option lets Microsoft sales teams consider your IP co-sell eligible solution when evaluating their customers' needs. For detailed information on how to prepare your offer for evaluation, see [Co-sell option in the commercial marketplace](/partner-center/co-sell-configure?context=/azure/marketplace/context/context). For details about IP co-sell requirements, see [Requirements for co-sell status](/legal/marketplace/certification-policies#3000-requirements-for-co-sell-status). For more information about marketing your offer through the Microsoft CSP partner channels, see [Cloud Solution Providers](cloud-solution-providers.md).
To learn more, see [Grow your cloud business with Azure Marketplace](https://azuremarketplace.microsoft.com/sell).
marketplace Plan Saas Offer https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/plan-saas-offer.md
You can choose to opt into Microsoft-supported marketing and sales channels. Whe
- **Resell through CSPs**: Use this option to allow Microsoft Cloud Solution Providers (CSP) partners to resell your solution as part of a bundled offer. For more information about this program, see [Cloud Solution Provider program](cloud-solution-providers.md).

-- **Co-sell with Microsoft**: This option lets Microsoft sales teams consider your IP co-sell eligible solution when evaluating their customers' needs. For details about co-sell eligibility, see [Requirements for co-sell status](/legal/marketplace/certification-policies#3000-requirements-for-co-sell-status). For detailed information on how to prepare your offer for evaluation, see [Co-sell option in Partner Center](co-sell-configure.md).
+- **Co-sell with Microsoft**: This option lets Microsoft sales teams consider your IP co-sell eligible solution when evaluating their customersΓÇÖ needs. For details about co-sell eligibility, see [Requirements for co-sell status](/legal/marketplace/certification-policies#3000-requirements-for-co-sell-status). For detailed information on how to prepare your offer for evaluation, see [Co-sell option in Partner Center](/partner-center/co-sell-configure?context=/azure/marketplace/context/context).
## Next steps
marketplace Power Bi App Technical Configuration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/power-bi-app-technical-configuration.md
Promote your app in Power BI app to production by providing the Power BI **App i
For more information, see [Publish apps with dashboards and reports in Power BI](/power-bi/service-create-distribute-apps).
-Select **Save draft** before continuing to the next tab in the left-nav menu, **Co-sell with Microsoft**. For information on setting up co-sell with Microsoft (optional), see [Co-sell partner engagement](./co-sell-overview.md). If you're not setting up co-sell or you've finished, continue with **Next steps** below.
+Select **Save draft** before continuing to the next tab in the left-nav menu, **Co-sell with Microsoft**. For information on setting up co-sell with Microsoft (optional), see [Co-sell partner engagement](/partner-center/co-sell-overview?context=/azure/marketplace/context/context). If you're not setting up co-sell or you've finished, continue with **Next steps** below.
## Next steps -- Configure [supplemental content](power-bi-app-supplemental-content.md)
+- Configure [supplemental content](power-bi-app-supplemental-content.md)
marketplace Preferred Solution https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/preferred-solution.md
Last updated 10/26/2021
## What is the Microsoft preferred solution badge?
-The preferred solution badge is awarded as a [commercial marketplace benefit](./co-sell-overview.md) to:
+The preferred solution badge is awarded as a [commercial marketplace benefit](/partner-center/co-sell-overview?context=/azure/marketplace/context/context) to:
- Offers published to the commercial marketplace with an Azure IP co-sell incentive - Offers enrolled in the Microsoft Business Applications ISV Connect program with co-sell ready status
Until July 2021, publishers with at least one co-sell ready offer were eligible
## Next steps -- To configure an offer for co-sell, see [Configure Co-sell for a commercial marketplace offer](./co-sell-configure.md)-- For information about co-sell incentive status, see [Requirements for Azure IP Co-sell incentive status](./co-sell-requirements.md) or [Business Applications Co-sell incentive status](./co-sell-requirements.md)
+- To configure an offer for co-sell, see [Configure Co-sell for a commercial marketplace offer](/partner-center/co-sell-configure?context=/azure/marketplace/context/context)
+- For information about co-sell incentive status, see [Requirements for Azure IP Co-sell incentive status](/partner-center/co-sell-requirements?context=/azure/marketplace/context/context) or [Business Applications Co-sell incentive status](/partner-center/co-sell-requirements?context=/azure/marketplace/context/context)
marketplace Reference Architecture Diagram https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/reference-architecture-diagram.md
- Title: Reference architecture diagram | Azure Marketplace
-description: How to create a reference architecture diagram for an offer in the Microsoft commercial marketplace.
------ Previously updated : 09/22/2021--
-# Reference architecture diagram
-
-The reference architecture diagram is a model that represents the infrastructure your offer relies on. For Azure IP solutions, the diagram should also show how your offer uses Microsoft's cloud services per the technical requirements of IP Co-sell. It is not designed to assess the quality of the architecture. It is designed to show how your solution uses Microsoft services.
-
-The reference architecture diagram can be created via multiple tools. We recommend Microsoft Visio, as it has multiple stencils that depict Azure architecture models.
-
-A helpful starting point for building reference architecture diagrams is to leverage the [Azure Architecture models](/azure/architecture/browse/).
-
-## Typical components of a reference architecture diagram
-
-The diagram must clearly identify your IP as solution, application, or service code both deployed on and driving consumption of Microsoft Azure. This code must be highly reusable and not depend on extensive customization per deployment.
--- Cloud services that host and interact with your offer, including ones that consume Azure resources-- Data connections, data layers, and data services being consumed by your offer-- Cloud services used to control security, authentication, and users of the offer-- User interfaces and other services that expose the offer to users-- Hybrid or on-premises connectivity and integrations or a combination of both-
-This screenshot shows an example of a reference architecture diagram.
-
-[![This image is an example Co-sell architecture diagram.](./media/co-sell/co-sell-arch-diagram.png)](./media/co-sell/co-sell-arch-diagram.png#lightbox)
-
-This example reference architecture diagram is for a vertical industry chatbot that can be integrated with intranet sites to help with forecast demand scenarios via a machine learning algorithm. This solution uses supply chain and manufacturing schedule data from different ERP systems. The bot is designed to address questions about when a salesperson can commit on possible delivery dates for an order.
-
-## Next steps
--- [Configure Co-sell for a commercial marketplace offer](./co-sell-configure.md)
media-services Migrate V 2 V 3 Differences Feature Gaps https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/media-services/latest/migrate-v-2-v-3-differences-feature-gaps.md
multiple Previously updated : 03/25/2021 Last updated : 01/31/2022
![migration steps 2](./media/migration-guide/steps-2.svg)
-This part of the migration guidance gives you detailed information about the
-differences between the V2 and V3 APIs.
+This part of the migration guidance gives you detailed information about the differences between the V2 and V3 APIs.
## Feature gaps between V2 and V3 APIs The V3 API has the following feature gaps with the V2 API. A couple
-of the advanced features of the Media Encoder Standard in V2 APIs are currently
-not available in V3:
+of the advanced features of the Media Encoder Standard in V2 APIs are currently not available in V3:
-- Inserting a silent audio track when input has no audio, as this is no longer required with the Azure Media Player.
+- Inserting a silent audio track when input has no audio or inserting a monochrome video track when input has no video, as this is no longer required with the Azure Media Player.
- Inserting a video track when input has no video.
+- The `InsertBlackIfNoVideoBottomLayerOnly` and `InsertBlackIfNoVideo` flags are no longer supported in v3.
+ - Live Events with transcoding currently don't support Slate insertion mid-stream and ad marker insertion via API call. - Azure Media Premium Encoder will no longer be supported in V2. If you're using it for 8-bit HEVC encoding, use the new HEVC support in the Standard Encoder.
not available in V3:
- In Media Services V3, FairPlay IV cannot be specified. While it doesn't impact customers using Media Services for both packaging and license delivery, it can be an issue when using a third-party DRM system to deliver the FairPlay licenses (hybrid mode). -- Client-side storage encryption for protection of assets at rest has been removed in the V3 API and replaced by storage service encryption for data at rest. The V3 APIs continue to work with existing storage encrypted assets but won't allow creation of new ones.
+- Client-side storage encryption for protection of assets at rest has been removed in the V3 API and replaced by storage service encryption for data at rest. The V3 APIs continue to work with existing storage-encrypted assets but won't allow creation of new ones.
## Terminology and entity changes
migrate Migrate Support Matrix Hyper V Migration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/migrate/migrate-support-matrix-hyper-v-migration.md
You can select up to 10 VMs at once for replication. If you want to migrate more
| :- | :- | | **Deployment** | The Hyper-V host can be standalone or deployed in a cluster. <br/>Azure Migrate replication software (Hyper-V Replication provider) is installed on the Hyper-V hosts.| | **Permissions** | You need administrator permissions on the Hyper-V host. |
-| **Host operating system** | Windows Server 2019, Windows Server 2016, or Windows Server 2012 R2 with latest updates. Note that Server core installation of these operating systems is also supported. |
+| **Host operating system** | Windows Server 2022, Windows Server 2019, Windows Server 2016, or Windows Server 2012 R2 with the latest updates. Note that the Server Core installation of these operating systems is also supported. |
| **Other Software requirements** | .NET Framework 4.7 or later | | **Port access** | Outbound connections on HTTPS port 443 to send VM replication data.
network-watcher Network Watcher Nsg Flow Logging Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/network-watcher/network-watcher-nsg-flow-logging-overview.md
Also, when a NSG is deleted, by default the associated flow log resource is dele
## Best practices
-**Enable on critical VNETs/Subnets**: Flow Logs should be enabled on all critical VNETs/subnets in your subscription as an auditability and security best practice.
+**Enable on critical subnets**: Flow Logs should be enabled on all critical subnets in your subscription as an auditability and security best practice.
**Enable NSG Flow Logging on all NSGs attached to a resource**: Flow logging in Azure is configured on the NSG resource. A flow is associated with only one NSG rule. In scenarios where multiple NSGs are utilized, we recommend enabling NSG flow logs on all NSGs applied at the resource's subnet or network interface to ensure that all traffic is recorded. For more information, see [how traffic is evaluated](../virtual-network/network-security-group-how-it-works.md) in Network Security Groups. A few common scenarios: 1. **Multiple NICs at a VM**: If multiple NICs are attached to a virtual machine, flow logging must be enabled on all of them
-1. **Having NSG at both NIC and Subnet Level**: In case NSG is configured at the NIC as well as the Subnet level, then flow logging must be enabled at both the NSGs.
+1. **Having NSG at both NIC and Subnet Level**: If an NSG is configured at the NIC as well as the subnet level, flow logging must be enabled on both NSGs, because the exact sequence of rule processing by NSGs at the NIC and subnet level is platform dependent and varies from case to case. Traffic flows will be logged against the NSG that is processed last.
1. **AKS Cluster Subnet**: AKS adds a default NSG at the cluster subnet. As explained in the above point, flow logging must be enabled on this default NSG. **Storage provisioning**: Storage should be provisioned in line with the expected flow log volume.
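Flow logs can also be enabled from the command line. The following is a minimal sketch using the Azure CLI; the resource names and region are placeholder assumptions, not values from this article.

```bash
# Enable an NSG flow log that writes to a storage account (placeholder names).
az network watcher flow-log create \
  --resource-group MyResourceGroup \
  --name MyFlowLog \
  --location eastus \
  --nsg MyNsg \
  --storage-account mystorageaccount \
  --enabled true
```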
postgresql How To Upgrade Using Dump And Restore https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/postgresql/how-to-upgrade-using-dump-and-restore.md
Last updated 11/30/2021
# Upgrade your PostgreSQL database using dump and restore >[!NOTE]
-> The concepts explained in this documentation is applicable to both Azure Database for PostgreSQL - Single Server and Azure Database for PostgreSQL - Flexible Server.
+> The concepts explained in this documentation are applicable to both Azure Database for PostgreSQL - Single Server and Azure Database for PostgreSQL - Flexible Server.
You can upgrade your PostgreSQL server deployed in Azure Database for PostgreSQL by migrating your databases to a higher major version server using the following methods. * **Offline** method using PostgreSQL [pg_dump](https://www.postgresql.org/docs/current/static/app-pgdump.html) and [pg_restore](https://www.postgresql.org/docs/current/static/app-pgrestore.html), which incurs downtime for migrating the data. This document addresses this method of upgrade/migration.
-* **Online** method using [Database Migration Service](../dms/tutorial-azure-postgresql-to-azure-postgresql-online-portal.md) (DMS). This method provides a reduced downtime migration and keeps the target database in-sync with with the source and you can choose when to cut-over. However, there are few prerequisites and restrictions to be addressed for using DMS. For details, see the [DMS documentation](../dms/tutorial-azure-postgresql-to-azure-postgresql-online-portal.md).
+* **Online** method using [Database Migration Service](../dms/tutorial-azure-postgresql-to-azure-postgresql-online-portal.md) (DMS). This method provides a reduced downtime migration and keeps the target database in sync with the source, and you can choose when to cut over. However, there are a few prerequisites and restrictions to be addressed for using DMS. For details, see the [DMS documentation](../dms/tutorial-azure-postgresql-to-azure-postgresql-online-portal.md).
The following table provides some recommendations based on database sizes and scenarios.
To step through this how-to guide, you need:
- A **source** PostgreSQL database server running a lower version of the engine that you want to upgrade. - A **target** PostgreSQL database server with the desired major version [Azure Database for PostgreSQL server - Single Server](quickstart-create-server-database-portal.md) or [Azure Database for PostgreSQL - Flexible Server](./flexible-server/quickstart-create-server-portal.md). - A PostgreSQL client system to run the dump and restore commands. It is recommended to use the higher database version. For example, if you are upgrading from PostgreSQL version 9.6 to 11, use the PostgreSQL version 11 client.
- - It can be a Linux or Windows client with PostgreSQL installed and has [pg_dump](https://www.postgresql.org/docs/current/static/app-pgdump.html) and [pg_restore](https://www.postgresql.org/docs/current/static/app-pgrestore.html) command-line utilities installed.
+ - It can be a Linux or Windows client that has PostgreSQL installed and that has the [pg_dump](https://www.postgresql.org/docs/current/static/app-pgdump.html) and [pg_restore](https://www.postgresql.org/docs/current/static/app-pgrestore.html) command-line utilities installed.
- Alternatively, you can use [Azure Cloud Shell](https://shell.azure.com), or open it by selecting the Azure Cloud Shell icon on the menu bar at the upper right in the [Azure portal](https://portal.azure.com). You will have to sign in to your account with `az login` before running the dump and restore commands. - Your PostgreSQL client should preferably run in the same region as the source and target servers.
If you do not have a PostgreSQL client or you want to use Azure Cloud Shell, the
3. Once the upgrade (migration) process completes, you can test your application with the target server. 4. Repeat this process for all the databases within the server.
- As an example, the following table illustrates time it took to migrate using streaming dump method. The sample data is populated using [pgbench](https://www.postgresql.org/docs/10/pgbench.html). As your database can have different number of objects with varied sizes than pgbench generated tables and indexes, it is highly recommended to test dump and restore of your database to understand the actual time it takes to upgrade your database.
+ As an example, the following table illustrates the time it took to migrate using the streaming dump method. The sample data is populated using [pgbench](https://www.postgresql.org/docs/10/pgbench.html). As your database can have a different number of objects with varied sizes than pgbench-generated tables and indexes, it is highly recommended to test dump and restore of your database to understand the actual time it takes to upgrade your database.
| **Database Size** | **Approx. time taken** | | -- | |
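For reference, a streaming dump and restore can be expressed as a single pipeline. The sketch below is illustrative only; the server names, user names, and database name are placeholder assumptions, not values from this guide.

```bash
# Stream a custom-format dump from the source server directly into the target
# server, avoiding an intermediate dump file (placeholder names throughout).
pg_dump -h source-server.postgres.database.azure.com -U myuser@source-server \
  -Fc -d mydb \
| pg_restore -h target-server.postgres.database.azure.com -U myuser@target-server \
  --no-owner -d mydb
```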
purview How To Data Owner Policy Authoring Generic https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/purview/how-to-data-owner-policy-authoring-generic.md
This section describes the steps to create a new policy in Azure Purview.
1. Sign in to Azure Purview Studio.
-1. Navigate to the **Policy management** app using the left side panel. Then select **Data policies**.
+1. Navigate to the **Data policy** feature using the left side panel. Then select **Data policies**.
1. Select the **New Policy** button in the policy page.
Steps to create a new policy in Azure Purview are as follows.
1. Sign in to Azure Purview Studio.
-1. Navigate to the **Policy management** app using the left side panel. Then select **Data policies**.
+1. Navigate to the **Data policy** feature using the left side panel. Then select **Data policies**.
![Image shows how a data owner can access the Policy functionality in Azure Purview when they want to update a policy.](./media/access-policies-common/policy-onboard-guide-2.png)
The steps to publish a policy are as follows
1. Sign in to Azure Purview Studio.
-1. Navigate to the Policy management app using the left side panel. Then select **Data policies**.
+1. Navigate to the **Data policy** feature using the left side panel. Then select **Data policies**.
![Image shows how a data owner can access the Policy functionality in Azure Purview when they want to publish a policy.](./media/access-policies-common/policy-onboard-guide-2.png)
purview Register Scan Adls Gen2 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/purview/register-scan-adls-gen2.md
It is important to give your service principal the permission to scan the ADLS G
[!INCLUDE [view and manage scans](includes/view-and-manage-scans.md)] ## Access policy+
+Access policies allow data owners to manage access to datasets from Azure Purview. Owners can monitor and manage data use from within the Azure Purview Studio, without directly modifying the storage account where the data is housed.
++
+To create an access policy for Azure Data Lake Storage Gen 2, follow the guidelines below.
+ [!INCLUDE [Azure Storage specific pre-requisites](./includes/access-policies-prerequisites-storage.md)]
-Follow these configuration guides to:
-- [Configure from Azure Purview data owner access policies on an Azure Storage account](./tutorial-data-owner-policies-storage.md)-- [Configure from Azure Purview data owner access policies on all data sources in a subscription or a resource group](./tutorial-data-owner-policies-resource-group.md)
+### Create an access policy
+
+Now that you've prepared your storage account and environment for access policies, you can follow one of these configuration guides to create your policies:
+- [Single storage account](./tutorial-data-owner-policies-storage.md) - This guide will allow you to enable access policies on a single storage account in your subscription.
+- [All sources in a subscription or resource group](./tutorial-data-owner-policies-resource-group.md) - This guide will allow you to enable access policies on all enabled and available sources in a resource group, or across an Azure subscription.
## Next steps
purview Register Scan Amazon Rds https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/purview/register-scan-amazon-rds.md
The Multi-Cloud Scanning Connector for Azure Purview allows you to explore your organizational data across cloud providers, including Amazon Web Services, in addition to Azure storage services. + This article describes how to use Azure Purview to scan your structured data currently stored in Amazon RDS, including both Microsoft SQL and PostgreSQL databases, and discover what types of sensitive information exist in your data. You'll also learn how to identify the Amazon RDS databases where the data is currently stored for easy information protection and data compliance. For this service, use Azure Purview to provide a Microsoft account with secure access to AWS, where the Multi-Cloud Scanning Connectors for Azure Purview will run. The Multi-Cloud Scanning Connectors for Azure Purview use this access to your Amazon RDS databases to read your data, and then report the scanning results, including only the metadata and classification, back to Azure. Use the Azure Purview classification and labeling reports to analyze and review your data scan results.
For this service, use Azure Purview to provide a Microsoft account with secure a
> The Multi-Cloud Scanning Connectors for Azure Purview are separate add-ons to Azure Purview. The terms and conditions for the Multi-Cloud Scanning Connectors for Azure Purview are contained in the agreement under which you obtained Microsoft Azure Services. For more information, see Microsoft Azure Legal Information at https://azure.microsoft.com/support/legal/. >
-> [!IMPORTANT]
-> Azure Purview support for Amazon RDS is currently in PREVIEW. The [Azure Preview Supplemental Terms](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
->
- ## Azure Purview scope for Amazon RDS - **Supported database engines**: Amazon RDS structured data storage supports multiple database engines. Azure Purview supports Amazon RDS based on Microsoft SQL and PostgreSQL.
purview Register Scan Azure Blob Storage Source https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/purview/register-scan-azure-blob-storage-source.md
Scans can be managed or run again on completion
:::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-full-inc-scan.png" alt-text="full or incremental scan"::: ## Access policy+
+Access policies allow data owners to manage access to datasets from Azure Purview. Owners can monitor and manage data use from within the Azure Purview Studio, without directly modifying the storage account where the data is housed.
++
+To create an access policy for an Azure Storage account, follow the guidelines below.
+ [!INCLUDE [Azure Storage specific pre-requisites](./includes/access-policies-prerequisites-storage.md)]
-Follow these configuration guides to:
-- [Configure from Azure Purview data owner access policies on an Azure Storage account](./tutorial-data-owner-policies-storage.md)-- [Configure from Azure Purview data owner access policies on all data sources in a subscription or a resource group](./tutorial-data-owner-policies-resource-group.md)
+### Create an access policy
+
+Now that you've prepared your storage account and environment for access policies, you can follow one of these configuration guides to create your policies:
+- [Single storage account](./tutorial-data-owner-policies-storage.md) - This guide will allow you to enable access policies on a single storage account in your subscription.
+- [All sources in a subscription or resource group](./tutorial-data-owner-policies-resource-group.md) - This guide will allow you to enable access policies on all enabled and available sources in a resource group, or across an Azure subscription.
## Next steps
purview Register Scan Db2 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/purview/register-scan-db2.md
This article outlines how to register Db2, and how to authenticate and interact with Db2 in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
-> [!IMPORTANT]
-> Db2 as a source is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
## Supported capabilities
purview Register Scan Erwin Source https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/purview/register-scan-erwin-source.md
This article outlines how to register erwin Mart servers, and how to authenticate and interact with erwin Mart Servers in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
-> [!IMPORTANT]
-> erwin Mart server as a source is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
## Supported capabilities
purview Register Scan Google Bigquery Source https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/purview/register-scan-google-bigquery-source.md
This article outlines how to register Google BigQuery projects, and how to authenticate and interact with Google BigQuery in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
-> [!IMPORTANT]
-> Google BigQuery as a source is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
## Supported capabilities
purview Register Scan Looker Source https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/purview/register-scan-looker-source.md
This article outlines how to register Looker, and how to authenticate and interact with Looker in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
-> [!IMPORTANT]
-> Looker as a source is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
## Supported capabilities
purview Register Scan Mysql https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/purview/register-scan-mysql.md
This article outlines how to register MySQL, and how to authenticate and interact with MySQL in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
-> [!IMPORTANT]
-> MySQL as a source is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
## Supported capabilities
purview Register Scan Postgresql https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/purview/register-scan-postgresql.md
This article outlines how to register PostgreSQL, and how to authenticate and interact with PostgreSQL in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
-> [!IMPORTANT]
-> PostgreSQL as a source is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
## Supported capabilities
purview Register Scan Salesforce https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/purview/register-scan-salesforce.md
This article outlines how to register Salesforce, and how to authenticate and interact with Salesforce in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
-> [!IMPORTANT]
-> Salesforce as a source is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
## Supported capabilities
purview Register Scan Sap Hana https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/purview/register-scan-sap-hana.md
This article outlines how to register SAP HANA, and how to authenticate and interact with SAP HANA in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
-> [!IMPORTANT]
-> SAP HANA as a source is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
## Supported capabilities
purview Register Scan Snowflake https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/purview/register-scan-snowflake.md
This article outlines how to register Snowflake, and how to authenticate and interact with Snowflake in Azure Purview. For more information about Azure Purview, read the [introductory article](overview.md).
-> [!IMPORTANT]
-> Snowflake as a source is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
## Supported capabilities
purview Register Scan Synapse Workspace https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/purview/register-scan-synapse-workspace.md
This section describes how to register Azure Synapse Analytics workspaces in Azu
### Authentication for registration
-Only users with at least a *Reader* role on the Azure Synapse workspace who is also *data source administrators* in Azure Purview can register an Azure Synapse workspace.
+Only a user with at least a *Reader* role on the Azure Synapse workspace who is also a *data source administrator* in Azure Purview can register an Azure Synapse workspace.
### Steps to register
Then, you will need to [apply permissions to scan the contents of the workspace]
### Authentication for enumerating serverless SQL database resources
-There are three places you will need to set authentication to allow Azure Purview to enumerate your serverless SQL database resources: the Synapse workspace, the associated storage, and on the Serverless databases. The steps below will set permissions for all three.
+There are three places where you will need to set authentication to allow Azure Purview to enumerate your serverless SQL database resources: the Azure Synapse workspace, the associated storage, and the Azure Synapse serverless databases. The steps below will set permissions for all three.
+
+#### Azure Synapse workspace
1. In the Azure portal, go to the Azure Synapse workspace resource. 1. On the left pane, select **Access Control (IAM)**.
There are three places you will need to set authentication to allow Azure Purvie
1. Select the **Add** button. 1. Set the **Reader** role and enter your Azure Purview account name, which represents its managed service identity (MSI). 1. Select **Save** to finish assigning the role.
-1. In the Azure portal, go to the **Resource group** or **Subscription** that the Azure Synapse workspace is in.
+
+#### Storage account
+
+1. In the Azure portal, go to the **Resource group** or **Subscription** that the storage account associated with the Azure Synapse workspace is in.
1. On the left pane, select **Access Control (IAM)**. > [!NOTE] > You must be an *owner* or *user access administrator* to add a role in the **Resource group** or **Subscription** fields. - 1. Select the **Add** button. 1. Set the **Storage blob data reader** role and enter your Azure Purview account name (which represents its MSI) in the **Select** box. 1. Select **Save** to finish assigning the role.+
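If you prefer scripting the role assignment above, the Azure CLI offers an equivalent. This is a hedged sketch; the identity, scope, and IDs shown are placeholder assumptions based on the portal steps, not commands from this article.

```bash
# Grant the Purview account's managed identity read access to blob data
# at the resource group scope (placeholder object ID, subscription, and group).
az role assignment create \
  --assignee "<purview-managed-identity-object-id>" \
  --role "Storage Blob Data Reader" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
```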
+#### Azure Synapse serverless database
+ 1. Go to your Azure Synapse workspace and open the Synapse Studio. 1. Select the **Data** tab on the left menu. 1. Select the ellipsis (**...**) next to one of your databases, and then start a new SQL script.
role-based-access-control Built In Roles https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/role-based-access-control/built-in-roles.md
The following table provides a brief description of each built-in role. Click th
> | **Compute** | | | > | [Classic Virtual Machine Contributor](#classic-virtual-machine-contributor) | Lets you manage classic virtual machines, but not access to them, and not the virtual network or storage account they're connected to. | d73bb868-a0df-4d4d-bd69-98a00b01fccb | > | [Virtual Machine Administrator Login](#virtual-machine-administrator-login) | View Virtual Machines in the portal and login as administrator | 1c0163c0-47e6-4577-8991-ea5c82e286e4 |
-> | [Virtual Machine Contributor](#virtual-machine-contributor) | Create and manage virtual machines, manage disks and disk snapshots, install and run software, reset password of the root user of the virtual machine using VM extensions, and manage local user accounts using VM extensions. This role does not grant you management access to the virtual network or storage account the virtual machines are connected to. This role does not allow you to assign roles in Azure RBAC. | 9980e02c-c2be-4d73-94e8-173b1dc7cf3c |
+> | [Virtual Machine Contributor](#virtual-machine-contributor) | Create and manage virtual machines, manage disks, install and run software, reset password of the root user of the virtual machine using VM extensions, and manage local user accounts using VM extensions. This role does not grant you management access to the virtual network or storage account the virtual machines are connected to. This role does not allow you to assign roles in Azure RBAC. | 9980e02c-c2be-4d73-94e8-173b1dc7cf3c |
> | [Virtual Machine User Login](#virtual-machine-user-login) | View Virtual Machines in the portal and login as a regular user. | fb879df8-f326-4884-b1cf-06f3ad86be52 | > | **Networking** | | | > | [CDN Endpoint Contributor](#cdn-endpoint-contributor) | Can manage CDN endpoints, but can't grant access to other users. | 426e0c7f-0c7e-4658-b36f-ff54d6c29b45 |
search Cognitive Search Concept Intro https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/search/cognitive-search-concept-intro.md
Previously updated : 01/14/2022 Last updated : 02/01/2022 # AI enrichment in Azure Cognitive Search
-In Azure Cognitive Search, AI enrichment refers to a pipeline process that adds machine learning to [indexer-based indexing](search-indexer-overview.md). Steps in the pipeline create new information where none previously existed: extracting information from images, detecting sentiment or key phrases from chunks of text, and recognizing entities, to name a few. All of these processes result in making previously unsearchable content available to full text search and knowledge mining scenarios.
+*AI enrichment* is the application of machine learning models over raw content, where analysis and inference are used to create searchable content and structure where none previously existed. Because Azure Cognitive Search is a full text search solution, the purpose of AI enrichment is to improve the utility of your content in search-related scenarios:
-[**Azure Blob Storage**](../storage/blobs/storage-blobs-overview.md)) is the most commonly used input, but any indexer-supported data source can provide the initial content. A [**skillset**](cognitive-search-working-with-skillsets.md), attached to an indexer, adds the AI processing. The indexer extracts content and sets up the pipeline, while AI processing identifies, analyzes, and creates new information out of blobs, images, and raw text. Output is a [**search index**](search-what-is-an-index.md) or optional [**knowledge store**](knowledge-store-concept-intro.md).
++ Machine translation and language detection support multi-lingual search++ Entity recognition finds people, places, and other entities in large chunks of text++ Key phrase extraction identifies and then aggregates important terms++ Optical Character Recognition (OCR) extracts text from binary files++ Image analysis tags and describes images in searchable text fields+
+AI enrichment is an extension of an [**indexer**](search-indexer-overview.md) pipeline.
+
+[**Blobs in Azure Storage**](../storage/blobs/storage-blobs-overview.md) are the most common data input, but any supported data source can provide the initial content. A [**skillset**](cognitive-search-working-with-skillsets.md), attached to an indexer, adds the AI processing. The indexer extracts content and sets up the pipeline. The skillset performs the enrichment steps. Output is always a [**search index**](search-what-is-an-index.md), and optionally a [**knowledge store**](knowledge-store-concept-intro.md).
![Enrichment pipeline diagram](./media/cognitive-search-intro/cogsearch-architecture.png "enrichment pipeline overview")
-Skillsets are composed of built-in skills from Cognitive Search or [*custom skills*](cognitive-search-create-custom-skill-example.md) for external processing that you provide. Custom skills might sound complex but can be simple and straightforward in terms of implementation. If you have existing packages that provide pattern matching or document classification models, the content you extract during indexing could be passed to these models for processing.
+Skillsets are composed of [*built-in skills*](cognitive-search-predefined-skills.md) from Cognitive Search or [*custom skills*](cognitive-search-create-custom-skill-example.md) for external processing that you provide. Custom skills aren't always complex. For example, if you have an existing package that provides pattern matching or a document classification model, you can wrap it in a custom skill.
Built-in skills fall into these categories:
-+ **Machine translation** is provided by the [text translation](cognitive-search-skill-text-translation.md) skill, often paired with [language detection](cognitive-search-skill-language-detection.md) for multi-language solutions.
++ **Machine translation** is provided by the [Text Translation](cognitive-search-skill-text-translation.md) skill, often paired with [language detection](cognitive-search-skill-language-detection.md) for multi-language solutions. + **Image processing** skills include [Optical Character Recognition (OCR)](cognitive-search-skill-ocr.md) and identification of [visual features](cognitive-search-skill-image-analysis.md), such as facial detection, image interpretation, image recognition (famous people and landmarks), or attributes like image orientation. These skills create text representations of image content for full text search in Azure Cognitive Search.
-+ **Natural language processing** skills include [entity recognition](cognitive-search-skill-entity-recognition-v3.md), [language detection](cognitive-search-skill-language-detection.md), [key phrase extraction](cognitive-search-skill-keyphrases.md), text manipulation, [sentiment detection (including opinion mining)](cognitive-search-skill-sentiment-v3.md), and [personal identifiable information detection](cognitive-search-skill-pii-detection.md). With these skills, unstructured text is mapped as searchable and filterable fields in an index.
-
-Built-in skills are based on pre-trained machine learning models in Cognitive Services APIs: [Computer Vision](../cognitive-services/computer-vision/index.yml) and [Language Service](../cognitive-services/language-service/overview.md). You should [attach a billable Cognitive Services resource](cognitive-search-attach-cognitive-services.md) if you want these resources for larger workloads.
++ **Natural language processing** skills include [Entity Recognition](cognitive-search-skill-entity-recognition-v3.md), [Language Detection](cognitive-search-skill-language-detection.md), [Key Phrase Extraction](cognitive-search-skill-keyphrases.md), text manipulation, [Sentiment Detection (including opinion mining)](cognitive-search-skill-sentiment-v3.md), and [Personal Identifiable Information Detection](cognitive-search-skill-pii-detection.md). With these skills, unstructured text is mapped as searchable and filterable fields in an index.
-Natural language and image processing is applied during the data ingestion phase, with results becoming part of a document's composition in a searchable index in Azure Cognitive Search. Data is sourced as an Azure data set and then pushed through an indexing pipeline using whichever [built-in skills](cognitive-search-predefined-skills.md) you need.
+Built-in skills are based on the Cognitive Services APIs: [Computer Vision](../cognitive-services/computer-vision/index.yml) and [Language Service](../cognitive-services/language-service/overview.md). Unless your content input is small, expect to [attach a billable Cognitive Services resource](cognitive-search-attach-cognitive-services.md) to run larger workloads.
-## Feature availability
+## Availability and pricing
-AI enrichment is available in regions where Azure Cognitive Services is also available. You can check the current availability of AI enrichment on the [Azure products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=search) page. AI enrichment is available in all supported regions except:
+AI enrichment is available in regions that have Azure Cognitive Services. You can check the availability of AI enrichment on the [Azure products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=search) page. AI enrichment is available in all regions except:
+ Australia Southeast + China North 2 + Germany West Central
-If your search service is located in one of these regions, you will not be able to create and use skillsets, but all other search service functionality is available and fully supported.
+Billing follows a pay-as-you-go pricing model. The costs of using built-in skills are passed on when a multi-region Cognitive Services key is specified in the skillset. There are also costs associated with image extraction, as metered by Cognitive Search. Text extraction and utility skills, however, aren't billable. For more information, see [How you're charged for Azure Cognitive Search](search-sku-manage-costs.md#how-youre-charged-for-azure-cognitive-search).
## When to use AI enrichment
-You should consider enrichment if your raw content is unstructured text, image content, or content that needs language detection and translation. Applying AI through the built-in cognitive skills can unlock this content for full text search and data science applications.
+Enrichment is useful if raw content is unstructured text, image content, or content that needs language detection and translation. Applying AI through the built-in cognitive skills can unlock this content for full text search and data science applications.
-Additionally, you might consider adding a custom skill if you have open-source, third-party, or first-party code that you'd like to integrate into the pipeline. Classification models that identify salient characteristics of various document types fall into this category, but any package that adds value to your content could be used.
+Enrichment also unlocks external processing. Open-source, third-party, or first-party code can be integrated into the pipeline as a custom skill. Classification models that identify salient characteristics of various document types fall into this category, but any external package that adds value to your content could be used.
### Use-cases for built-in skills A [skillset](cognitive-search-defining-skillset.md) that's assembled using built-in skills is well suited for the following application scenarios:
-+ [Optical Character Recognition (OCR)](cognitive-search-skill-ocr.md) that recognizes typeface and handwritten text in scanned documents (JPEG) is perhaps the most commonly used skill. Attaching the OCR skill will identify, extract, and ingest text from JPEG files.
++ [Optical Character Recognition (OCR)](cognitive-search-skill-ocr.md) that recognizes typeface and handwritten text in scanned documents (JPEG) is perhaps the most commonly used skill.
-+ [Text translation](cognitive-search-skill-text-translation.md) of multilingual content is another commonly used skill. Language detection is built into Text Translation, but you can also run [Language Detection](cognitive-search-skill-language-detection.md) independently if you just want the language codes of the content in your corpus.
++ [Text translation](cognitive-search-skill-text-translation.md) of multilingual content is another commonly used skill. Language detection is built into Text Translation, but you can also run [Language Detection](cognitive-search-skill-language-detection.md) as a separate skill to output a language code for each chunk of content.
-+ PDFs with combined image and text. Text in PDFs can be extracted during indexing without the use of enrichment steps, but the addition of image and natural language processing can often produce a better outcome than a standard indexing provides.
++ PDFs with combined image and text. Embedded text can be extracted without AI enrichment, but adding image and language skills can unlock more information than what could be obtained through standard text-based indexing. + Unstructured or semi-structured documents containing content that has inherent meaning or context that is hidden in the larger document.
- Blobs in particular often contain a large body of content that is packed into a single "field". By attaching image and natural language processing skills to an indexer, you can create new information that is extant in the raw content, but not otherwise surfaced as distinct fields. Some ready-to-use built-in cognitive skills that can help: [Key Phrase Extraction](cognitive-search-skill-keyphrases.md) and [Entity Recognition](cognitive-search-skill-entity-recognition-v3.md) (people, organizations, and locations to name a few).
+ Blobs in particular often contain a large body of content that is packed into a single "field". By attaching image and natural language processing skills to an indexer, you can create information that is extant in the raw content, but not otherwise surfaced as distinct fields.
+
+ Some ready-to-use built-in cognitive skills that can help: [Key Phrase Extraction](cognitive-search-skill-keyphrases.md) and [Entity Recognition](cognitive-search-skill-entity-recognition-v3.md) (people, organizations, and locations to name a few).
Additionally, built-in skills can be used to restructure content through text split, merge, and shape operations. ### Use-cases for custom skills
-Custom skills can support more complex scenarios, such as recognizing forms, or custom entity detection using a model that you provide and wrap in the [custom skill web interface](cognitive-search-custom-skill-interface.md). Several examples of custom skills include [Forms Recognizer](../applied-ai-services/form-recognizer/overview.md), integration of the [Bing Entity Search API](./cognitive-search-create-custom-skill-example.md), and [custom entity recognition](https://github.com/Microsoft/SkillsExtractorCognitiveSearch).
+Custom skills can support more complex scenarios, such as recognizing forms, or custom entity detection using a model that you provide and wrap in the [custom skill web interface](cognitive-search-custom-skill-interface.md). Several examples of custom skills include:
+++ [Forms Recognizer](../applied-ai-services/form-recognizer/overview.md)++ [Bing Entity Search API](./cognitive-search-create-custom-skill-example.md)++ [Custom entity recognition](https://github.com/Microsoft/SkillsExtractorCognitiveSearch) ## Enrichment steps <a name="enrichment-steps"></a>
This step assembles all of the initial or raw content that will undergo AI enric
### Step 2: Skillset enrichment phase
-A skillset defines the atomic operations that are performed on each document. For example, for text and images extracted from a PDF, a skillset might apply entity recognition, language detection, or key phrase extraction to produce new fields in your index that are not available natively in the source.
+A skillset defines the atomic operations that are performed on each document. For example, for text and images extracted from a PDF, a skillset might apply entity recognition, language detection, or key phrase extraction to produce new fields in your index that aren't available natively in the source.
![Enrichment phase](./media/cognitive-search-intro/enrichment-phase-blowup.png "enrichment phase")
Internally, the pipeline generates a collection of enriched documents. You can d
### Step 3: Indexing
-Indexing is the process wherein raw and enriched content is ingested as fields in a search index, and as [projections](knowledge-store-projection-overview.md) if you are also creating a knowledge store. The same enriched content can appear in both, using implicit or explicit field mappings to send the content to the correct fields.
+Indexing is the process wherein raw and enriched content is ingested as fields in a search index, and as [projections](knowledge-store-projection-overview.md) if you're also creating a knowledge store. The same enriched content can appear in both, using implicit or explicit field mappings to send the content to the correct fields.
Enriched content is generated during skillset execution, and is temporary unless you save it. In order for enriched content to appear in a search index, the indexer must have mapping information so that it can send enriched content to a field in a search index. [Output field mappings](cognitive-search-output-field-mapping.md) set up these associations.
-## Saving enriched output
+## Storing enriched output
In Azure Cognitive Search, an indexer saves the output it creates. A [**searchable index**](search-what-is-an-index.md) is one of the outputs that is always created by an indexer. Specification of an index is an indexer requirement, and when you attach a skillset, the output of the skillset, plus any fields that are mapped directly from the source, are used to populate the index. Usually, the outputs of specific skills, such as key phrases or sentiment scores, are ingested into the index in fields created for that purpose.
-A [**knowledge store**](knowledge-store-concept-intro.md) is an optional output, used for downstream apps like knowledge mining. A knowledge store is defined within a skillset. Its definition determines whether your enriched documents are projected as tables or objects (files or blobs). Tabular projections are well suited for interactive analysis in tools like Power BI, whereas files and blobs are typically used in data science or similar processes.
+A [**knowledge store**](knowledge-store-concept-intro.md) is an optional output, used for downstream apps like knowledge mining. A knowledge store is defined within a skillset. Its definition determines whether your enriched documents are projected as tables or objects (files or blobs). Tabular projections are recommended for interactive analysis in tools like Power BI. Files and blobs are typically used in data science or similar workloads.
Finally, an indexer can [**cache enriched documents**](cognitive-search-incremental-indexing-conceptual.md) in Azure Blob Storage for potential reuse in subsequent skillset executions. The cache is for internal use. Cached enrichments are consumable by the same skillset that you rerun at a later date. Caching is helpful if your skillset includes image analysis or OCR, and you want to avoid the time and expense of reprocessing image files. Indexes and knowledge stores are fully independent of each other. While you must attach an index to satisfy indexer requirements, if your sole objective is a knowledge store, you can ignore the index after it's populated. Avoid deleting it though. If you want to rerun the indexer and skillset, you'll need the index in order for the indexer to run.
-## Consume enriched content
+## Consuming enriched content
The output of AI enrichment is either a [fully text-searchable index](search-what-is-an-index.md) on Azure Cognitive Search, or a [knowledge store](knowledge-store-concept-intro.md) in Azure Storage.
-### Accessing content in a search index
-
-[**Querying the index**](search-query-overview.md) is how developers and users access the enriched content generated by the pipeline. The index is like any other you might create for Azure Cognitive Search: you can supplement text analysis with custom analyzers, invoke fuzzy search queries, add filters, or experiment with scoring profiles to tune search relevance.
+### Check content in a search index
-### Accessing content in a knowledge store
+[Run queries](search-query-overview.md) to access the enriched content generated by the pipeline. The index is like any other you might create for Azure Cognitive Search: you can supplement text analysis with custom analyzers, invoke fuzzy search queries, add filters, or experiment with scoring profiles to tune search relevance.
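As a quick illustration, a query against the enriched index can be issued over REST. This is a minimal sketch with placeholder service, index, and key values; it assumes a field such as `keyPhrases` was populated by the skillset.

```bash
# Search the enriched index and return an enriched field (placeholder values).
curl -X GET "https://<service-name>.search.windows.net/indexes/<index-name>/docs?api-version=2020-06-30&search=contoso&\$select=keyPhrases" \
  -H "api-key: <query-key>"
```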
-In Azure Storage, a [knowledge store](knowledge-store-concept-intro.md) has two manifestations: a blob container of JSON document, a blob container of image objects, or tables in Table storage. You can use [Storage Browser](knowledge-store-view-storage-explorer.md), [Power BI](knowledge-store-connect-power-bi.md), or any app that connects to Azure Storage.
+### Check content in a knowledge store
-+ A blob container captures enriched documents in their entirety, which is useful if you want to feed into other processes.
+In Azure Storage, a [knowledge store](knowledge-store-concept-intro.md) can assume the following forms: a blob container of JSON documents, a blob container of image objects, or tables in Table Storage. You can use [Storage Browser](knowledge-store-view-storage-explorer.md), [Power BI](knowledge-store-connect-power-bi.md), or any app that connects to Azure Storage to access your content.
-+ In contrast, Table storage can accommodate physical projections of enriched documents. You can create slices or layers of enriched documents that include or exclude specific parts. For analysis in Power BI, the tables in Azure Table Storage become the data source for further visualization and exploration.
++ A blob container captures enriched documents in their entirety, which is useful if you're creating a feed into other processes.
-An enriched document at the output of the pipeline differs from its original source input by the presence of additional fields containing new information that was extracted or generated during enrichment. As such, you can work with a combination of original and created content, regardless of which output structure you use.
++ A table is useful if you need slices of enriched documents, or if you want to include or exclude specific parts of the output. For analysis in Power BI, tables are the recommended data source for data exploration and visualization. ## Checklist: A typical workflow
-1. When beginning a project, it's helpful to work with a subset of data. Indexer and skillset design is an iterative process, and you'll iterate more quickly if you're working with a small representative data set.
+1. When beginning a project, it's helpful to work with a subset of data. Indexer and skillset design is an iterative process, and the work goes faster with a small representative data set.
1. Create a [data source](/rest/api/searchservice/create-data-source) that specifies a connection to your data.
An enriched document at the output of the pipeline differs from its original sou
1. Create an [index schema](/rest/api/searchservice/create-index) that defines a search index.
-1. Create an [indexer](/rest/api/searchservice/create-indexer) to bring all of the above components together. Creating or running indexer retrieves the data, runs the skillset, and loads the index.
+1. Create an [indexer](/rest/api/searchservice/create-indexer) to bring all of the above components together. This step retrieves the data, runs the skillset, and loads the index.
1. Run queries to evaluate results and modify code to update skillsets, schema, or indexer configuration.
-To iterate over the above steps, [reset the indexer](search-howto-reindex.md) before rebuilding the pipeline, or delete and recreate the objects on each run (recommended if you are using the free tier). You should also [enable enrichment caching](cognitive-search-incremental-indexing-conceptual.md) to reuse existing enrichments wherever possible.
+To repeat any of the above steps, [reset the indexer](search-howto-reindex.md) before you run it. Or, delete and recreate the objects on each run (recommended if you're using the free tier). You should also [enable enrichment caching](cognitive-search-incremental-indexing-conceptual.md) to reuse existing enrichments wherever possible.
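To make the checklist concrete, the step that ties the components together can look like the following. This is a hedged sketch with placeholder names; only the `dataSourceName`, `targetIndexName`, and `skillsetName` properties are drawn from the steps above.

```bash
# Create an indexer that connects the data source, skillset, and index
# (placeholder service, key, and object names).
curl -X PUT "https://<service-name>.search.windows.net/indexers/<indexer-name>?api-version=2020-06-30" \
  -H "Content-Type: application/json" \
  -H "api-key: <admin-key>" \
  -d '{
        "dataSourceName": "<data-source-name>",
        "targetIndexName": "<index-name>",
        "skillsetName": "<skillset-name>"
      }'
```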
## Next steps
search Cognitive Search Defining Skillset https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/search/cognitive-search-defining-skillset.md
Title: Create a skillset
-description: Define data extraction, natural language processing, or image analysis steps to enrich and extract structured information from your data for use in Azure Cognitive Search.
+description: A skillset defines data extraction, natural language processing, and image analysis steps. A skillset is attached to an indexer. It's used to enrich and extract information from source data for use in Azure Cognitive Search.
Previously updated : 08/15/2021 Last updated : 01/31/2022 # Create a skillset in Azure Cognitive Search ![indexer stages](media/cognitive-search-defining-skillset/indexer-stages-skillset.png "indexer stages")
-A skillset defines the operations that extract and enrich data to make it searchable. A skillset executes after document cracking, when text and image content are extracted from source documents, and after any fields from the source document are (optionally) mapped to destination fields in an index or knowledge store.
+A skillset defines the operations that extract and enrich data to make it searchable. It executes after text and images are extracted, and after [field mappings](search-indexer-field-mappings.md) are processed.
-In this article, you'll learn the steps of creating a skillset. For reference, this article uses the [Create Skillset (REST API)](/rest/api/searchservice/create-skillset).
-
-Some usage rules for skillsets include the following:
+This article explains how to create a skillset with the [Create Skillset (REST API)](/rest/api/searchservice/create-skillset). Rules for skillset definition include:
+ A skillset is a top-level resource, which means it can be created once and referenced by many indexers.
+ A skillset must contain at least one skill.
+ A skillset can repeat skills of the same type (for example, multiple Shaper skills).
-Recall that indexers drive skillset execution, which means that you will also need to create an [indexer](search-howto-create-indexers.md), [data source](search-data-sources-gallery.md), and [search index](search-what-is-an-index.md) before you can test your skillset.
+An indexer drives skillset execution. You need an [indexer](search-howto-create-indexers.md), [data source](search-data-sources-gallery.md), and [search index](search-what-is-an-index.md) before you can test your skillset.
> [!TIP] > Enable [enrichment caching](cognitive-search-incremental-indexing-conceptual.md) to reuse the content you've already processed and lower the cost of development. ## Skillset definition
-Start with the basic structure. In the [REST API](/rest/api/searchservice/create-skillset), a skillset is authored in JSON and has the following sections:
+Start with the basic structure. In the [Create Skillset REST API](/rest/api/searchservice/create-skillset), the body of the request is authored in JSON and has the following sections:
```json {
Start with the basic structure. In the [REST API](/rest/api/searchservice/create
After the name and description, a skillset has four main properties:
-+ `skills` array, an unordered [collection of skills](cognitive-search-predefined-skills.md), for which the search service determines the sequence of execution based on the inputs required for each skill. If skills are independent, they will execute in parallel. Skills can be utilitarian (like splitting text), transformational (based on AI from Cognitive Services), or custom skills that you provide. An example of a skills array is provided in the following section.
++ `skills` array, an unordered [collection of skills](cognitive-search-predefined-skills.md), for which the search service determines the sequence of execution based on the inputs required for each skill. If skills are independent, they execute in parallel. Skills can be utilitarian (like splitting text), transformational (based on AI from Cognitive Services), or custom skills that you provide. An example of a skills array is provided in the next section. + `cognitiveServices` is used for [billable skills](cognitive-search-predefined-skills.md) that call Cognitive Services APIs. Remove this section if you aren't using billable skills or Custom Entity Lookup. [Attach a resource](cognitive-search-attach-cognitive-services.md) if you are.
-+ `knowledgeStore`, (optional) specifies an Azure Storage account and settings for projecting skillset output into tables, blobs, and files in Azure Storage. Remove this section if you don't need it, otherwise [specify a knowledge store](knowledge-store-create-rest.md).
++ `knowledgeStore` (optional) specifies an Azure Storage account and settings for projecting skillset output into tables, blobs, and files in Azure Storage. Remove this section if you don't need it, otherwise [specify a knowledge store](knowledge-store-create-rest.md).
-+ `encryptionKey`, (optional) specifies an Azure Key Vault and [customer-managed keys](search-security-manage-encryption-keys.md) used to encrypt sensitive content in a skillset definition. Remove this property if you aren't using customer-managed encryption.
++ `encryptionKey` (optional) specifies an Azure Key Vault and [customer-managed keys](search-security-manage-encryption-keys.md) used to encrypt sensitive content in a skillset definition. Remove this property if you aren't using customer-managed encryption. ## Add a skills array
-Within a skillset definition, the skills array specifies which skills to execute. The following example introduces you to its composition by showing you two unrelated, [built-in skills](cognitive-search-predefined-skills.md). Notice that each skill has a type, context, inputs, and outputs.
+Within a skillset definition, the skills array specifies which skills to execute. The following example shows two unrelated, [built-in skills](cognitive-search-predefined-skills.md). Notice that each skill has a type, context, inputs, and outputs.
```json "skills":[
Within a skillset definition, the skills array specifies which skills to execute
``` > [!NOTE]
-> You can build complex skillsets with looping and branching, using the [Conditional skill](cognitive-search-skill-conditional.md) to create the expressions. The syntax is based on the [JSON Pointer](https://tools.ietf.org/html/rfc6901) path notation, with a few modifications to identify nodes in the enrichment tree. A `"/"` traverses a level lower in the tree and `"*"` acts as a for-each operator in the context. Numerous examples in this article illustrate the syntax.
+> You can build complex skillsets with looping and branching using the [Conditional skill](cognitive-search-skill-conditional.md) to create the expressions. The syntax is based on the [JSON Pointer](https://tools.ietf.org/html/rfc6901) path notation, with a few modifications to identify nodes in the enrichment tree. A `"/"` traverses a level lower in the tree and `"*"` acts as a for-each operator in the context. Numerous examples in this article illustrate the syntax.
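As a sketch of this expression syntax, a Conditional skill that routes French text to a common `text` node might look like the following. The field names under `/document` are hypothetical.

```json
{
  "@odata.type": "#Microsoft.Skills.Util.ConditionalSkill",
  "context": "/document",
  "inputs": [
    { "name": "condition", "source": "= $(/document/language) == 'fr'" },
    { "name": "whenTrue", "source": "/document/frenchText" },
    { "name": "whenFalse", "source": "/document/otherText" }
  ],
  "outputs": [
    { "name": "output", "targetName": "text" }
  ]
}
```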
### How built-in skills are structured
-Each skill is unique in terms of its input values and the parameters it takes. The [documentation for each skill](cognitive-search-predefined-skills.md) describes all of the parameters and properties of a given skill. Although there are differences, most skills share a common set and are similarly patterned. To illustrate several points, the [Entity Recognition skill](cognitive-search-skill-entity-recognition-v3.md) provides an example:
+Each skill is unique in terms of its input values and the parameters that it takes. The [documentation for each skill](cognitive-search-predefined-skills.md) describes all of the parameters and properties of a given skill. Although there are differences, most skills share a common set and are similarly patterned. To illustrate several points, the [Entity Recognition skill](cognitive-search-skill-entity-recognition-v3.md) provides an example:
```json {
Common parameters include "odata.type", "inputs", and "outputs". The other param
+ **"odata.type"** uniquely identifies each skill. You can find the type in the [skill reference documentation](cognitive-search-predefined-skills.md).
-+ **"context"** is a node in an enrichment tree and it represents the level at which operations take place. All skills have this property. If the "context" field is not explicitly set, the default context is `"/document"`. In the example, the context is the whole document, meaning that the entity recognition skill is called once per document.
++ **"context"** is a node in an enrichment tree and it represents the level at which operations take place. All skills have this property. If the "context" field is not explicitly set, the default context is `"/document"`. In the example, the context is the whole document, which means that the entity recognition skill is called once per document.
- The context also determines where outputs are also produced in the enrichment tree. In this example, the skill returns a property called `"organizations"`, captured as `orgs`, which is added as a child node of `"/document"`. In downstream skills, the path to this newly-created enrichment node is `"/document/orgs"`. For a particular document, the value of `"/document/orgs"` is an array of organizations extracted from the text (for example: `["Microsoft", "LinkedIn"]`). For more information about path syntax, see [Referencing annotations in a skillset](cognitive-search-concept-annotations-syntax.md).
+ The context also determines where outputs are produced in the enrichment tree. In this example, the skill returns a property called `"organizations"`, captured as `orgs`, which is added as a child node of `"/document"`. In downstream skills, the path to this node is `"/document/orgs"`. For a particular document, the value of `"/document/orgs"` is an array of organizations extracted from the text (for example: `["Microsoft", "LinkedIn"]`). For more information about path syntax, see [How to reference annotations in a skillset](cognitive-search-concept-annotations-syntax.md).
-+ **"inputs"** specify the origin of the incoming data and how it will be used. In the case of Entity Recognition, one of the inputs is `"text"`, which is the content to be analyzed for entities. The content is sourced from the `"/document/content"` node in an enrichment tree. In an enrichment tree, `"/document"` is the root node. For documents retrieved using an Azure Blob indexer, the `content` field of each document is a standard field created by the indexer.
++ **"inputs"** specify the origin of the incoming data and how it's used. In the [Entity Recognition](cognitive-search-skill-entity-recognition-v3.md) skill, one of the inputs is `"text"`, which is the content to be analyzed for entities. The content is sourced from the `"/document/content"` node in an enrichment tree. In an enrichment tree, `"/document"` is the root node. For documents retrieved by an Azure Blob indexer, the `content` field of each document is a standard field created by the indexer.
-+ **"outputs"** represent the output of the skill. Each skill is designed to emit specific kinds of output, which are referenced by name in the skillset. In the case of Entity Recognition, `"organizations"` is one of the outputs it supports. The documentation for each skill describes the outputs it can produce.
++ **"outputs"** represent the output of the skill. Each skill is designed to emit specific kinds of output, which are referenced by name in the skillset. In Entity Recognition, `"organizations"` is one of the outputs it supports. The documentation for each skill describes the outputs it can produce.
-Outputs exist only during processing. To chain this output to a downstream skill's input, reference the output as `"/document/orgs"`. To send output to a field in a search index, [create an output field mapping](cognitive-search-output-field-mapping.md) in an indexer. To send output to a knowledge store, [create a projection](knowledge-store-projection-overview.md).
+Outputs exist only during processing. To chain this output to the input of a downstream skill, reference the output as `"/document/orgs"`. To send output to a field in a search index, [create an output field mapping](cognitive-search-output-field-mapping.md) in an indexer. To send output to a knowledge store, [create a projection](knowledge-store-projection-overview.md).
-Outputs from the one skill can conflict with outputs from a different skill. If you have multiple skills returning the same output, use the `"targetName"` for name disambiguation in enrichment node paths.
+Outputs from one skill can conflict with outputs from a different skill. If you have multiple skills that return the same output, use the `"targetName"` for name disambiguation in enrichment node paths.
Some situations call for referencing each element of an array separately. For example, suppose you want to pass *each element* of `"/document/orgs"` separately to another skill. To do so, add an asterisk to the path: `"/document/orgs/*"`
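To illustrate, a downstream skill that processes each organization separately might set both its context and its input source to the asterisk path. This sketch is purely to show the path syntax; whether key phrase extraction over organization names is useful depends on your scenario.

```json
{
  "@odata.type": "#Microsoft.Skills.Text.KeyPhraseExtractionSkill",
  "context": "/document/orgs/*",
  "inputs": [
    { "name": "text", "source": "/document/orgs/*" }
  ],
  "outputs": [
    { "name": "keyPhrases", "targetName": "orgKeyPhrases" }
  ]
}
```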
The second skill for sentiment analysis follows the same pattern as the first en
} ```
-## Adding a custom skill
+## Add a custom skill
-Below is an example of a [custom skill](cognitive-search-custom-skill-web-api.md). The URI points to an Azure Function, which in turn invokes the model or transformation that you provide. For more information, see [Define a custom interface](cognitive-search-custom-skill-interface.md).
+This section includes an example of a [custom skill](cognitive-search-custom-skill-web-api.md). The URI points to an Azure Function, which in turn invokes the model or transformation that you provide. For more information, see [Define a custom interface](cognitive-search-custom-skill-interface.md).
Although the custom skill is executing code that is external to the pipeline, in a skills array, it's just another skill. Like the built-in skills, it has a type, context, inputs, and outputs. It also reads and writes to an enrichment tree, just as the built-in skills do. Notice that the "context" field is set to `"/document/orgs/*"` with an asterisk, meaning the enrichment step is called *for each* organization under `"/document/orgs"`.
-Output, in this case a company description, is generated for each organization identified. When referring to the description in a downstream step (for example, in key phrase extraction), you would use the path `"/document/orgs/*/companyDescription"` to do so.
+Output, such as the company description in this example, is generated for each organization that's identified. When referring to the node in a downstream step (for example, in key phrase extraction), you would use the path `"/document/orgs/*/companyDescription"` to do so.
```json {
Output, in this case a company description, is generated for each organization i
## Send output to an index
-As each skill executes, its output is added as nodes in a document's enrichment tree. Enriched documents exist in the pipeline as temporary data structures. To create a permanent data structure, and gain full visibility into what a skill is actually producing, you will need to send the output to a search index or a [knowledge store](knowledge-store-concept-intro.md).
+As each skill executes, its output is added as nodes in a document's enrichment tree. Enriched documents exist in the pipeline as temporary data structures. To create a permanent data structure, and gain full visibility into what a skill is actually producing, send the output to a search index or a [knowledge store](knowledge-store-concept-intro.md).
-In the early stages of skillset evaluation, you'll want to check preliminary results with minimal effort. We recommend the search index because it's simpler to set up. For each skill output, [define an output field mapping](cognitive-search-output-field-mapping.md) in the indexer, and a field in the index.
+In the early stages of skillset evaluation, checking preliminary results is important. We recommend a search index over a knowledge store because it's simpler to set up. For each skill output, [define an output field mapping](cognitive-search-output-field-mapping.md) in the indexer, and a field in the search index.
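For example, the indexer's `outputFieldMappings` section associates an enrichment node with an index field. The target field name here is hypothetical.

```json
"outputFieldMappings": [
  {
    "sourceFieldName": "/document/orgs",
    "targetFieldName": "organizations"
  }
]
```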
-After running the indexer, you can use [Search Explorer](search-explorer.md) to return documents from the index and check the contents of each field to determine what the skillset detected or created.
+After you run the indexer, use [Search Explorer](search-explorer.md) to return documents from the index and check the contents of each field to determine what the skillset detected or created.
-The following example shows the results of an entity recognition skill that detected persons, locations, organizations, and other entities in a chunk of text. Viewing the results in Search Explorer can help you determine whether a skill adds value to your solution.
+This screenshot shows the results of an entity recognition skill that detected persons, locations, organizations, and other entities in a chunk of text. You can view the results to decide whether a skill adds value to your solution.
:::image type="content" source="media/cognitive-search-defining-skillset/doc-in-search-explorer.png" alt-text="Screenshot of a document in Search Explorer."::: ## Tips for a first skillset
-+ Assemble a representative sample of your content in Blob Storage or another supported indexer data source and run the **Import data** wizard to create the skillset, index, indexer, and data source object.
++ Assemble a representative sample of your content in Blob Storage or another supported data source and run the [**Import data** wizard](search-import-data-portal.md).
- The wizard automates several steps that can be challenging the first time around, including defining the fields in an index, defining output filed mappings in an indexer, and projections in a knowledge store if you are using one. For some skills, such as OCR or image analysis, the wizard will add utility skills that merge image and text content that was separated during document cracking.
+ The wizard automates several steps that can be challenging the first time around. It defines fields in an index, field mappings in an indexer, and projections in a knowledge store if you are using one. For some skills, such as OCR or image analysis, the wizard adds utility skills that merge the image and text content that was separated during document cracking.
-+ Alternatively, you can import skill Postman collections that provide full examples of all of the object definitions required to evaluate a skill, from skillset to an index that you can query to view the results of a transformation.
++ Alternatively, you can [import sample Postman collections](https://github.com/Azure-Samples/azure-search-postman-samples) that provide a full articulation of the object definitions required to evaluate a skill. ## Next steps
-Context and input source fields are paths to nodes in an enrichment tree. As a next step, learn more about the syntax for setting up paths to nodes in an enrichment tree.
+Context and input source fields are paths to nodes in an enrichment tree. As a next step, learn more about the path syntax for nodes in an enrichment tree.
> [!div class="nextstepaction"] > [Referencing annotations in a skillset](cognitive-search-concept-annotations-syntax.md)
search Search Blob Storage Integration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/search/search-blob-storage-integration.md
Once the index is created and populated, it exists independently of your blob co
You need both Azure Cognitive Search and Azure Blob Storage. Within blob storage, you need a container that provides source content.
-You can start directly in your Storage account portal page. In the left navigation page, under **Blob service** select **Add Azure Cognitive Search** to create a new service or select an existing one.
+You can start directly in your Storage Account portal page.
-Once you add Azure Cognitive Search to your storage account, you can follow the standard process to index blob data. We recommend the **Import data** wizard in Azure Cognitive Search for an easy initial introduction, or call the REST APIs using a tool like Postman. This tutorial walks you through the steps of calling the REST API in Postman: [Index and search semi-structured data (JSON blobs) in Azure Cognitive Search](search-semi-structured-data.md).
+1. In the left navigation pane under **Data management**, select **Azure search** to choose or create a search service.
+
+1. Follow the steps in the wizard to extract and optionally create searchable content from your blobs. The workflow is the [**Import data** wizard](cognitive-search-quickstart-blob.md).
+
+ :::image type="content" source="media/search-blob-storage-integration/blob-blade.png" alt-text="Screenshot of the Azure search wizard in the Azure Storage portal page." border="true":::
+
+1. Use [Search explorer](search-explorer.md) in the search portal page to query your content.
+
+The wizard is the best place to start, but you'll discover more flexible options when you [configure a blob indexer](search-howto-indexing-azure-blob-storage.md) yourself. You can call the REST APIs using a tool like Postman or Visual Studio Code. [Tutorial: Index and search semi-structured data (JSON blobs) in Azure Cognitive Search](search-semi-structured-data.md) walks you through the steps of calling the REST API in Postman.
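As a sketch, the body of a [Create Data Source](/rest/api/searchservice/create-data-source) request for blob content might look like the following. The names are placeholders, and `query` is optional if you don't need to scope indexing to a virtual directory.

```json
{
  "name": "my-blob-datasource",
  "type": "azureblob",
  "credentials": { "connectionString": "<Azure Storage connection string>" },
  "container": { "name": "my-container", "query": "optional/virtual/directory" }
}
```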
## Use a Blob indexer An *indexer* is a data-source-aware subservice in Cognitive Search, equipped with internal logic for sampling data, reading metadata, retrieving data, and serializing data from native formats into JSON documents for subsequent import.
-Blobs in Azure Storage are indexed using the [Azure Cognitive Search Blob storage indexer](search-howto-indexing-azure-blob-storage.md). You can invoke this indexer by using the **Import data** wizard, a REST API, or the .NET SDK. In code, you use this indexer by setting the type, and by providing connection information that includes an Azure Storage account along with a blob container. You can subset your blobs by creating a virtual directory, which you can then pass as a parameter, or by filtering on a file type extension.
+Blobs in Azure Storage are indexed using the [blob indexer](search-howto-indexing-azure-blob-storage.md). You can invoke this indexer by using the **Azure search** command in Azure Storage, the **Import data** wizard, a REST API, or the .NET SDK. In code, you use this indexer by setting the type, and by providing connection information that includes an Azure Storage account along with a blob container. You can subset your blobs by creating a virtual directory, which you can then pass as a parameter, or by filtering on a file type extension.
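For example, an indexer definition can filter blobs by file extension through its configuration parameters. This sketch uses placeholder names; see the blob indexer article for the full parameter list.

```json
{
  "name": "my-blob-indexer",
  "dataSourceName": "my-blob-datasource",
  "targetIndexName": "my-index",
  "parameters": {
    "configuration": {
      "indexedFileNameExtensions": ".pdf,.docx",
      "excludedFileNameExtensions": ".png,.jpeg"
    }
  }
}
```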
An indexer ["cracks a document"](search-indexer-overview.md#document-cracking), opening a blob to inspect content. It's the first step in the pipeline after the indexer connects to the data source. For blob data, this is where PDF, Office docs, and other content types are detected. Document cracking with text extraction is free of charge. If your blobs contain image content, images are ignored unless you [add AI enrichment](cognitive-search-concept-intro.md). Standard indexing applies only to text content.
-The Blob indexer comes with configuration parameters and supports change tracking if the underlying data provides sufficient information. You can learn more about the core functionality in [Azure Cognitive Search Blob storage indexer](search-howto-indexing-azure-blob-storage.md).
+The Blob indexer comes with configuration parameters and supports change tracking if the underlying data provides sufficient information. You can learn more about the core functionality in [Blob indexer](search-howto-indexing-azure-blob-storage.md).
### Supported access tiers
By running a Blob indexer over a container, you can extract text and metadata fr
### Indexing blob metadata
-A common scenario that makes it easy to sort through blobs of any content type is to index both custom metadata and system properties for each blob. In this way, information for all blobs is indexed regardless of document type, stored in an index in your search service. Using your new index, you can then proceed to sort, filter, and facet across all Blob storage content.
+A common scenario that makes it easy to sort through blobs of any content type is to [index both custom metadata and system properties](search-blob-metadata-properties.md) for each blob. In this way, information for all blobs is indexed regardless of document type, stored in an index in your search service. Using your new index, you can then proceed to sort, filter, and facet across all Blob storage content.
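As an illustration, an indexer's `fieldMappings` can route standard blob properties into index fields. The target field names here are hypothetical.

```json
"fieldMappings": [
  { "sourceFieldName": "metadata_storage_name", "targetFieldName": "fileName" },
  { "sourceFieldName": "metadata_storage_last_modified", "targetFieldName": "lastModified" }
]
```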
> [!NOTE] > Blob Index tags are natively indexed by the Blob storage service and exposed for querying. If your blobs' key/value attributes require indexing and filtering capabilities, Blob Index tags should be leveraged instead of metadata.
Indexers can be configured to extract structured content found in blobs that con
## Search blob content in a search index
-The output of an indexer is a search index, used for interactive exploration using free text and filtered queries in a client app. For initial exploration and verification of content, we recommend starting with [Search Explorer](search-explorer.md) in the portal to examine document structure. You can use [simple query syntax](query-simple-syntax.md), [full query syntax](query-lucene-syntax.md), and [filter expression syntax](query-odata-filter-orderby-syntax.md) in Search explorer.
+The output of an indexer is a search index, used for interactive exploration using free text and filtered queries in a client app. For initial exploration and verification of content, we recommend starting with [Search Explorer](search-explorer.md) in the portal to examine document structure. In Search explorer, you can use:
+
++ [Simple query syntax](query-simple-syntax.md)
++ [Full query syntax](query-lucene-syntax.md)
++ [Filter expression syntax](query-odata-filter-orderby-syntax.md)

A more permanent solution is to gather query inputs and present the response as search results in a client application. The following C# tutorial explains how to build a search application: [Create your first application in Azure Cognitive Search](tutorial-csharp-create-first-app.md).
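As a sketch, a client might POST a query body like the following to the [Search Documents](/rest/api/searchservice/search-documents) REST API, combining free text with a filter expression. The field names are hypothetical.

```json
{
  "search": "budget hotel",
  "queryType": "simple",
  "filter": "metadata_storage_size gt 1024",
  "select": "metadata_storage_name, content",
  "count": true
}
```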
search Search Howto Managed Identities Cosmos Db https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/search/search-howto-managed-identities-cosmos-db.md
Title: Set up a connection to a Cosmos DB account using a managed identity
+ Title: Connect to Cosmos DB
description: Learn how to set up an indexer connection to a Cosmos DB account using a managed identity
search Search Howto Managed Identities Data Sources https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/search/search-howto-managed-identities-data-sources.md
Title: Set up a connection to a data source using a managed identity
+ Title: Use a managed identity for indexer connections
description: Learn how to set up an indexer connection to a data source using a managed identity
search Search Howto Managed Identities Sql https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/search/search-howto-managed-identities-sql.md
Title: Set up a connection to Azure SQL Database using a managed identity
+ Title: Connect to Azure SQL
description: Learn how to set up an indexer connection to Azure SQL Database using a managed identity
search Search Howto Managed Identities Storage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/search/search-howto-managed-identities-storage.md
Title: Set up a connection to a storage account using a managed identity
+ Title: Connect to Azure Storage
description: Learn how to set up an indexer connection to an Azure Storage account using a managed identity
search Search What Is Data Import https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/search/search-what-is-data-import.md
Title: Import and data ingestion in search indexes
+ Title: Data import and data ingestion
description: Populate and upload data to an index in Azure Cognitive Search from external data sources.
Last updated 12/07/2021
-# Data import overview - Azure Cognitive Search
+# Data import in Azure Cognitive Search
-In Azure Cognitive Search, queries execute over user-owned content that's loaded into a [search index](search-what-is-an-index.md). This article describes the two basic workflows for populating an index: *push* your data into the index programmatically, or point an [Azure Cognitive Search indexer](search-indexer-overview.md) at a supported data source to *pull* in the data.
+In Azure Cognitive Search, queries execute over user-owned content that's loaded into a [search index](search-what-is-an-index.md). This article describes the two basic workflows for populating an index: *push* your data into the index programmatically, or *pull* in the data using a [search indexer](search-indexer-overview.md).
-With either approach, the objective is to load data from an external data source into an Azure Cognitive Search index. Azure Cognitive Search will let you create an empty index, but until you push or pull data into it, it's not queryable.
+With either approach, the objective is to load data from an external data source. Although you can create an empty index, it's not queryable until you add the content.
> [!NOTE] > If [AI enrichment](cognitive-search-concept-intro.md) is a solution requirement, you must use the pull model (indexers) to load an index. External processing is supported only through skillsets attached to an indexer. ## Pushing data to an index
-The push model, used to programmatically send your data to Azure Cognitive Search, is the most flexible approach. First, it has no restrictions on data source type. Any dataset composed of JSON documents can be pushed to an Azure Cognitive Search index, assuming each document in the dataset has fields mapping to fields defined in your index schema. Second, it has no restrictions on frequency of execution. You can push changes to an index as often as you like. For applications having very low latency requirements (for example, if you need search operations to be in sync with dynamic inventory databases), the push model is your only option.
+The push model, used to programmatically send your data to Azure Cognitive Search, is the most flexible approach for the following reasons:
-This approach is more flexible than the pull model because you can upload documents individually or in batches (up to 1000 per batch or 16 MB, whichever limit comes first). The push model also allows you to upload documents to Azure Cognitive Search regardless of where your data is.
++ First, there are no restrictions on data source type. The dataset must be composed of JSON documents that map to your index schema, but the data can come from anywhere.
+
++ Second, there are no restrictions on frequency of execution. You can push changes to an index as often as you like. For applications having low latency requirements (for example, if you need search operations to be in sync with dynamic inventory databases), the push model is your only option.
+
++ Third, you can upload documents individually or in batches up to 1000 per batch, or 16 MB per batch, whichever limit comes first.

### How to push data to an Azure Cognitive Search index
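As a sketch, the body of an [Add, Update or Delete Documents](/rest/api/searchservice/addupdate-or-delete-documents) request batches one action per document. The index and field names here are hypothetical; the operations themselves are described next.

```json
{
  "value": [
    { "@search.action": "upload", "hotelId": "1", "hotelName": "Stay-Kay City Hotel" },
    { "@search.action": "mergeOrUpload", "hotelId": "2", "tags": ["economy", "pool"] },
    { "@search.action": "delete", "hotelId": "3" }
  ]
}
```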
Whether you use the REST API or an SDK, the following document operations are su
+ **Upload**, similar to an "upsert" where the document is inserted if it is new, and updated or replaced if it exists. If the document is missing values that the index requires, the document field's value is set to null.
-+ **merge** updates a document that already exists and will fail if the document cannot be found. Any field you specify in a merge will replace the existing field value in the index. Be sure to check for collection fields that contain multiple values, such as fields of type `Collection(Edm.String)`. For example, if `tags` fields starts with a value of `["budget"]` and you execute a merge with value `["economy", "pool"]`, the final value of the `tags` field will be `["economy", "pool"]`. It will not be `["budget", "economy", "pool"]`.
++ **merge** updates a document that already exists, and fails if the document cannot be found. Merge replaces existing values. For this reason, be sure to check for collection fields that contain multiple values, such as fields of type `Collection(Edm.String)`. For example, if a `tags` field starts with a value of `["budget"]` and you execute a merge with `["economy", "pool"]`, the final value of the `tags` field is `["economy", "pool"]`. It won't be `["budget", "economy", "pool"]`.
-+ **mergeOrUpload** behaves like **merge** if a document with the given key already exists in the index, and **upload** if the document is new.
++ **mergeOrUpload** behaves like **merge** if the document exists, and **upload** if the document is new.
-+ **delete** removes the entire document from the index. If you want to remove an individual field from a document, use **merge** instead, setting the field in question to null.
++ **delete** removes the entire document from the index. If you want to remove an individual field, use **merge** instead, setting the field in question to null. ## Pulling data into an index
The pull model crawls a supported data source and automatically uploads the data
+ [SharePoint in Microsoft 365 (preview)](search-howto-index-sharepoint-online.md) + [Power Query data connectors (preview)](search-how-to-index-power-query-data-sources.md)
-Indexers connect an index to a data source (usually a table, view, or equivalent structure), and map source fields to equivalent fields in the index. During execution, the rowset is automatically transformed to JSON and loaded into the specified index. All indexers support scheduling so that you can specify how frequently the data is to be refreshed. Most indexers provide change tracking if the data source supports it. By tracking changes and deletes to existing documents in addition to recognizing new documents, indexers remove the need to actively manage the data in your index.
+Indexers connect an index to a data source (usually a table, view, or equivalent structure), and map source fields to equivalent fields in the index. During execution, the rowset is automatically transformed to JSON and loaded into the specified index. All indexers support schedules so that you can specify how frequently the data is to be refreshed. Most indexers provide change tracking if the data source supports it. By tracking changes and deletes to existing documents in addition to recognizing new documents, indexers remove the need to actively manage the data in your index.
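For example, a schedule is expressed on the indexer as an ISO 8601 interval. The values here are illustrative.

```json
"schedule": {
  "interval": "PT2H",
  "startTime": "2022-01-01T00:00:00Z"
}
```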
### How to pull data into an Azure Cognitive Search index Indexer functionality is exposed in the [Azure portal](search-import-data-portal.md), the [REST API](/rest/api/searchservice/Indexer-operations), and the [.NET SDK](/dotnet/api/azure.search.documents.indexes.searchindexerclient).
-An advantage to using the portal is that Azure Cognitive Search can usually generate a default index schema for you by reading the metadata of the source dataset. You can modify the generated index until the index is processed, after which the only schema edits allowed are those that do not require reindexing. If the changes you want to make impact the schema directly, you would need to rebuild the index.
+An advantage to using the portal is that Azure Cognitive Search can usually generate a default index schema by reading the metadata of the source dataset. You can modify the generated index until the index is processed, after which the only schema edits allowed are those that do not require reindexing. If the changes affect the schema itself, you would need to rebuild the index.
## Verify data import with Search explorer
sentinel Create Codeless Connector https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/sentinel/create-codeless-connector.md
After creating your [JSON configuration file](#create-a-connector-json-configura
1. Authenticate to the Azure API. For more information, see [Getting started with REST](/rest/api/azure/).
- 1. Invoke an `UPSERT` API call to Microsoft Sentinel to deploy your new connector. Your data connector is deployed to your Microsoft Sentinel workspace, and is available on the **Data connectors** page.
+ 1. Invoke a [CREATE or UPDATE](/rest/api/securityinsights/preview/data-connectors/create-or-update) API call to Microsoft Sentinel to deploy your new connector. In the request body, define the `kind` value as `APIPolling`.
+
+ Your data connector is deployed to your Microsoft Sentinel workspace, and is available on the **Data connectors** page.
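As a sketch, and assuming the property names from your JSON configuration file, the request body might look like the following. All values here are placeholders.

```json
{
  "kind": "APIPolling",
  "properties": {
    "connectorUiConfig": {
      "title": "My custom connector",
      "publisher": "Contoso"
    },
    "pollingConfig": {
      "auth": { "authType": "APIKey" },
      "request": { "apiEndpoint": "https://example.com/api/logs" }
    }
  }
}
```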
After creating your [JSON configuration file](#create-a-connector-json-configura
# [Connect via API](#tab/connect-via-api)
- Use the `CONNECT` endpoint to send a PUT method and pass the JSON configuration directly in the body of the message. For more information, see [auth configuration](#auth-configuration).
+ Use the [CONNECT](/rest/api/securityinsights/preview/data-connectors/connect) endpoint to send a PUT method and pass the JSON configuration directly in the body of the message. For more information, see [auth configuration](#auth-configuration).
Use the following API attributes, depending on the [authType](#authtype) defined. For each `authType` parameter, all listed attributes are mandatory and are string values.
Use one of the following methods:
- **Azure portal**: In your Microsoft Sentinel data connector page, select **Disconnect**. -- **API**: Use the DISCONNECT API to send a PUT call with an empty body to the following URL:
+- **API**: Use the [DISCONNECT](/rest/api/securityinsights/preview/data-connectors/disconnect) API to send a PUT call with an empty body to the following URL:
```rest https://management.azure.com/subscriptions/{{SUB}}/resourceGroups/{{RG}}/providers/Microsoft.OperationalInsights/workspaces/{{WS-NAME}}/providers/Microsoft.SecurityInsights/dataConnectors/{{Connector_Id}}/disconnect?api-version=2021-03-01-preview
sentinel Data Connectors Reference https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/sentinel/data-connectors-reference.md
Add http://localhost:8081/ under **Authorized redirect URIs** while creating [We
| **Data ingestion method** | **Azure service-to-service integration: <br>[API-based connections](connect-azure-windows-microsoft-services.md#api-based-connections)**<br><br>Also available in the [Microsoft 365 Insider Risk Management solution](sentinel-solutions-catalog.md#domain-solutions) | | **License and other prerequisites** | <ul><li>Valid subscription for Microsoft 365 E5/A5/G5, or their accompanying Compliance or IRM add-ons.<li>[Microsoft 365 Insider risk management](/microsoft-365/compliance/insider-risk-management) fully onboarded, and [IRM policies](/microsoft-365/compliance/insider-risk-management-policies) defined and producing alerts.<li>[Microsoft 365 IRM configured](/microsoft-365/compliance/insider-risk-management-settings#export-alerts-preview) to enable the export of IRM alerts to the Office 365 Management Activity API in order to receive the alerts through the Microsoft Sentinel connector.) | **Log Analytics table(s)** | SecurityAlert |
-| **Data query filter** | `SecurityAlert`<br>`\| where ProductName == "Microsoft 365 Insider Risk Management"` |
+| **Data query filter** | `SecurityAlert`<br>`| where ProductName == "Microsoft 365 Insider Risk Management"` |
| **Supported by** | Microsoft | | | |
-## Microsoft Project
-| Connector attribute | Description |
-| | |
-| **Data ingestion method** | **Azure service-to-service integration: <br>[API-based connections](connect-azure-windows-microsoft-services.md#api-based-connections)** |
-| **License prerequisites/<br>Cost information** | Your Office 365 deployment must be on the same tenant as your Microsoft Sentinel workspace.<br>Other charges may apply |
-| **Log Analytics table(s)** | ProjectActivity |
-| **Supported by** | Microsoft |
-| | |
- ## Microsoft Defender for Cloud | Connector attribute | Description |
Add http://localhost:8081/ under **Authorized redirect URIs** while creating [We
| Connector attribute | Description | | | | | **Data ingestion method** | **Azure service-to-service integration: <br>[API-based connections](connect-azure-windows-microsoft-services.md#api-based-connections)** |
-| **License prerequisites/<br>Cost information** | Your Office 365 deployment must be on the same tenant as your Microsoft Sentinel workspace.<br>Other charges may apply |
+| **License prerequisites/<br>Cost information** | Your Office 365 deployment must be on the same tenant as your Microsoft Sentinel workspace.<br>Other charges may apply. |
| **Log Analytics table(s)** | OfficeActivity | | **Supported by** | Microsoft | | | |
-## Microsoft Power BI
+## Microsoft Power BI (Preview)
| Connector attribute | Description | | | | | **Data ingestion method** | **Azure service-to-service integration: <br>[API-based connections](connect-azure-windows-microsoft-services.md#api-based-connections)** |
-| **License prerequisites/<br>Cost information** | Your Office 365 deployment must be on the same tenant as your Microsoft Sentinel workspace.<br>Other charges may apply |
+| **License prerequisites/<br>Cost information** | Your Office 365 deployment must be on the same tenant as your Microsoft Sentinel workspace.<br>Other charges may apply. |
| **Log Analytics table(s)** | PowerBIActivity | | **Supported by** | Microsoft | | | |
+
+## Microsoft Project (Preview)
+| Connector attribute | Description |
+| | |
+| **Data ingestion method** | **Azure service-to-service integration: <br>[API-based connections](connect-azure-windows-microsoft-services.md#api-based-connections)** |
+| **License prerequisites/<br>Cost information** | Your Office 365 deployment must be on the same tenant as your Microsoft Sentinel workspace.<br>Other charges may apply. |
+| **Log Analytics table(s)** | ProjectActivity |
+| **Supported by** | Microsoft |
+| | |
## Microsoft Sysmon for Linux (Preview)
service-bus-messaging Authenticate Application https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/service-bus-messaging/authenticate-application.md
Title: Authenticate an application to access Azure Service Bus entities description: This article provides information about authenticating an application with Azure Active Directory to access Azure Service Bus entities (queues, topics, etc.) Previously updated : 06/14/2021 Last updated : 01/06/2022 # Authenticate and authorize an application with Azure Active Directory to access Azure Service Bus entities Azure Service Bus supports using Azure Active Directory (Azure AD) to authorize requests to Service Bus entities (queues, topics, subscriptions, or filters). With Azure AD, you can use Azure role-based access control (Azure RBAC) to grant permissions to a security principal, which may be a user, group, or application service principal. To learn more about roles and role assignments, see [Understanding the different roles](../role-based-access-control/overview.md).
+> [!IMPORTANT]
+> You can disable local or SAS key authentication for a Service Bus namespace and allow only Azure Active Directory authentication. For step-by-step instructions, see [Disable local authentication](disable-local-authentication.md).
+ ## Overview When a security principal (a user, group, or application) attempts to access a Service Bus entity, the request must be authorized. With Azure AD, access to a resource is a two-step process.
service-bus-messaging Disable Local Authentication https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/service-bus-messaging/disable-local-authentication.md
+
+ Title: Disable local authentication with Azure Service Bus
+description: This article explains how to disable local or Shared Access Signature key authentication for a Service Bus namespace.
+ Last updated : 02/01/2022 ++
+# Disable local or shared access key authentication with Azure Service Bus
+There are two ways to authenticate to Azure Service Bus resources: Azure Active Directory (Azure AD) and Shared Access Signatures (SAS). Azure AD provides superior security and ease of use over SAS. With Azure AD, there's no need to store the tokens in your code and risk potential security vulnerabilities. We recommend that you use Azure AD with your Azure Service Bus applications when possible.
+
+This article explains how to disable SAS key authentication and use only Azure AD for authentication.
+
+## Use portal to disable local auth
+In this section, you learn how to use the Azure portal to disable local authentication.
+
+1. Navigate to your Service Bus namespace in the [Azure portal](https://portal.azure.com).
+1. In the **Essentials** section of the **Overview** page, select **Enabled** for **Local Authentication**.
+
+ :::image type="content" source="./media/disable-local-authentication/portal-overview-enabled.png" alt-text="Image showing the Overview page of a Service Bus namespace with Local Authentication set to Enabled.":::
+1. On the **Local Authentication** page, select **Disabled**, and select **OK**.
+
+    :::image type="content" source="./media/disable-local-authentication/select-disabled.png" alt-text="Screenshot showing the Local Authentication page with the Disabled option selected.":::
+
+## Use Resource Manager template to disable local auth
+You can disable local authentication for a Service Bus namespace by setting the `disableLocalAuth` property to `true`, as shown in the following Azure Resource Manager template.
+
+```json
+{
+ "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
+ "contentVersion": "1.0.0.0",
+ "parameters": {
+ "namespace_name": {
+ "defaultValue": "spcontososbusns",
+ "type": "String"
+ }
+ },
+ "variables": {},
+ "resources": [
+ {
+ "type": "Microsoft.ServiceBus/namespaces",
+ "apiVersion": "2021-06-01-preview",
+ "name": "[parameters('namespace_name')]",
+ "location": "East US",
+ "sku": {
+ "name": "Standard",
+ "tier": "Standard"
+ },
+ "properties": {
+ "disableLocalAuth": true,
+ "zoneRedundant": false
+ }
+ }
+ ]
+}
+```
+
+### Parameters.json
+
+```json
+{
+ "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
+ "contentVersion": "1.0.0.0",
+ "parameters": {
+ "namespace_name": {
+ "value": null
+ }
+ }
+}
+```
+
+## Azure policy
+You can assign the [disable local auth](https://portal.azure.com/#blade/Microsoft_Azure_Policy/PolicyDetailBlade/definitionId/%2Fproviders%2FMicrosoft.Authorization%2FpolicyDefinitions%2Fcfb11c26-f069-4c14-8e36-56c394dae5af) Azure policy to an Azure subscription or a resource group to enforce disabling of local authentication for all Service Bus namespaces in the subscription or the resource group.
++
+## Next steps
+See the following to learn about Azure AD and SAS authentication.
+
+- [Authentication with SAS](service-bus-sas.md)
+- Authentication with Azure AD
+ - [Authenticate with managed identities](service-bus-managed-service-identity.md)
+ - [Authenticate from an application](authenticate-application.md)
service-bus-messaging Monitor Service Bus Reference https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/service-bus-messaging/monitor-service-bus-reference.md
The following management operations are captured in operational logs:
> [!NOTE] > Currently, *Read* operations aren't tracked in the operational logs. +
+## Runtime audit logs
+Runtime audit logs capture aggregated diagnostic information for all data plane access operations (such as send or receive messages) in Service Bus.
+
+> [!NOTE]
+> Runtime audit logs are currently available only in the **premium** tier.
+
+Runtime audit logs include the elements listed in the following table:
+
+Name | Description
+- | -
+`ActivityId` | A randomly generated UUID that ensures uniqueness for the audit activity.
+`ActivityName` | Runtime operation name.
+`ResourceId` | Resource associated with the activity.
+`Timestamp` | Aggregation time.
+`Status` | Status of the activity (success or failure).
+`Protocol` | Type of the protocol associated with the operation.
+`AuthType` | Type of authentication (Azure Active Directory or SAS Policy).
+`AuthKey` | Azure Active Directory application ID or SAS policy name that's used to authenticate to a resource.
+`NetworkType` | Type of the network access: `Public` or `Private`.
+`ClientIP` | IP address of the client application.
+`Count` | Total number of operations performed during the aggregated period of 1 minute.
+`Properties` | Metadata specific to the data plane operation.
+`Category` | Log category.
+
+Here's an example of a runtime audit log entry:
+
+```json
+{
+ "ActivityId": "<activity id>",
+ "ActivityName": "ConnectionOpen | Authorization | SendMessage | ReceiveMessage",
+ "ResourceId": "/SUBSCRIPTIONS/xxx/RESOURCEGROUPS/<Resource Group Name>/PROVIDERS/MICROSOFT.SERVICEBUS/NAMESPACES/<Service Bus namespace>/servicebus/<service bus name>",
+ "Time": "1/1/2021 8:40:06 PM +00:00",
+ "Status": "Success | Failure",
+ "Protocol": "AMQP | HTTP | SBMP",
+ "AuthType": "SAS | AAD",
+ "AuthId": "<AAD Application Name| SAS policy name>",
+ "NetworkType": "Public | Private",
+ "ClientIp": "x.x.x.x",
+ "Count": 1,
+ "Category": "RuntimeAuditLogs"
+ }
+
+```
+ ## Azure Monitor Logs tables Azure Service Bus uses Kusto tables from Azure Monitor Logs. You can query these tables with Log Analytics. For a list of Kusto tables the service uses, see [Azure Monitor Logs table reference](/azure/azure-monitor/reference/tables/tables-resourcetype#service-bus).
service-bus-messaging Monitor Service Bus https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/service-bus-messaging/monitor-service-bus.md
Following are sample queries that you can use to help you monitor your Azure Ser
| where Category == "OperationalLogs" | summarize count() by EventName_s, _ResourceId ```++ Get runtime audit logs generated in the last hour. +
+ ```Kusto
+ AzureDiagnostics
+ | where TimeGenerated > ago(1h)
+ | where ResourceProvider =="MICROSOFT.SERVICEBUS"
+ | where Category == "RuntimeAuditLogs"
+ ```
+ + Get access attempts to a key vault that resulted in a "key not found" error.
service-bus-messaging Service Bus Authentication And Authorization https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/service-bus-messaging/service-bus-authentication-and-authorization.md
Title: Azure Service Bus authentication and authorization | Microsoft Docs description: Authenticate apps to Service Bus with Shared Access Signature (SAS) authentication. Previously updated : 09/15/2021 Last updated : 02/01/2022 # Service Bus authentication and authorization
For more information about authenticating with Azure AD, see the following artic
> [!IMPORTANT] > Authorizing users or applications using OAuth 2.0 token returned by Azure AD provides superior security and ease of use over shared access signatures (SAS). With Azure AD, there is no need to store the tokens in your code and risk potential security vulnerabilities. We recommend that you use Azure AD with your Azure Service Bus applications when possible.
+>
+> You can disable local or SAS key authentication for a Service Bus namespace and allow only Azure AD authentication. For step-by-step instructions, see [Disable local authentication](disable-local-authentication.md).
## Shared access signature [SAS authentication](service-bus-sas.md) enables you to grant a user access to Service Bus resources, with specific rights. SAS authentication in Service Bus involves the configuration of a cryptographic key with associated rights on a Service Bus resource. Clients can then gain access to that resource by presenting a SAS token, which consists of the resource URI being accessed and an expiry signed with the configured key.
service-bus-messaging Service Bus Managed Service Identity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/service-bus-messaging/service-bus-managed-service-identity.md
Title: Managed identities for Azure resources with Service Bus description: This article describes how to use managed identities to access with Azure Service Bus entities (queues, topics, and subscriptions). Previously updated : 06/14/2021 Last updated : 01/06/2022
With managed identities, the Azure platform manages this runtime identity. You do not need to store and protect access keys in your application code or configuration, either for the identity itself, or for the resources you need to access. A Service Bus client app running inside an Azure App Service application, or in a virtual machine with managed identities for Azure resources enabled, does not need to handle SAS rules and keys, or any other access tokens. The client app only needs the endpoint address of the Service Bus Messaging namespace. When the app connects, Service Bus binds the managed identity's context to the client in an operation that is shown in an example later in this article. Once it is associated with a managed identity, your Service Bus client can do all authorized operations. Authorization is granted by associating a managed identity with Service Bus roles.
+> [!IMPORTANT]
+> You can disable local or SAS key authentication for a Service Bus namespace and allow only Azure Active Directory authentication. For step-by-step instructions, see [Disable local authentication](disable-local-authentication.md).
+ ## Overview When a security principal (a user, group, or application) attempts to access a Service Bus entity, the request must be authorized. With Azure AD, access to a resource is a two-step process.
service-bus-messaging Service Bus Sas https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/service-bus-messaging/service-bus-sas.md
Title: Azure Service Bus access control with Shared Access Signatures description: Overview of Service Bus access control using Shared Access Signatures overview, details about SAS authorization with Azure Service Bus. Previously updated : 10/18/2021 Last updated : 01/06/2022 ms.devlang: csharp
SAS guards access to Service Bus based on authorization rules. Those are configu
> Microsoft recommends using Azure AD with your Azure Service Bus applications when possible. For more information, see the following articles: > - [Authenticate and authorize an application with Azure Active Directory to access Azure Service Bus entities](authenticate-application.md). > - [Authenticate a managed identity with Azure Active Directory to access Azure Service Bus resources](service-bus-managed-service-identity.md)
+>
+> You can disable local or SAS key authentication for a Service Bus namespace and allow only Azure AD authentication. For step-by-step instructions, see [Disable local authentication](disable-local-authentication.md).
## Overview of SAS
static-web-apps Assign Roles Microsoft Graph https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/static-web-apps/assign-roles-microsoft-graph.md
In this tutorial, you learn to:
"userDetailsClaim": "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name", "registration": { "openIdIssuer": "https://login.microsoftonline.com/<YOUR_AAD_TENANT_ID>",
- "clientIdSettingName": "AZURE_CLIENT_ID",
- "clientSecretSettingName": "AZURE_CLIENT_SECRET"
+ "clientIdSettingName": "AAD_CLIENT_ID",
+ "clientSecretSettingName": "AAD_CLIENT_SECRET"
}, "login": { "loginParameters": [
In this tutorial, you learn to:
1. Select the **Edit** button to update the file.
-1. Update the *openIdIssuer* value of `https://login.microsoftonline.com/<YOUR_TENANT_ID>` by replacing `<YOUR_AAD_TENANT_ID>` with the directory (tenant) ID of your Azure Active Directory.
+1. Update the *openIdIssuer* value of `https://login.microsoftonline.com/<YOUR_AAD_TENANT_ID>` by replacing `<YOUR_AAD_TENANT_ID>` with the directory (tenant) ID of your Azure Active Directory.
1. Select **Commit directly to the main branch** and select **Commit changes**.
In this tutorial, you learn to:
| Name | Value | ||-|
- | `AZURE_CLIENT_ID` | *Your Active Directory application (client) ID* |
- | `AZURE_CLIENT_SECRET` | *Your Active Directory application client secret value* |
+ | `AAD_CLIENT_ID` | *Your Active Directory application (client) ID* |
+ | `AAD_CLIENT_SECRET` | *Your Active Directory application client secret value* |
1. Select **Save**.
storage File Sync Planning https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/storage/file-sync/file-sync-planning.md
A sync group contains one cloud endpoint, or Azure file share, and at least one
> You can make changes to the namespace of any cloud endpoint or server endpoint in the sync group and have your files synced to the other endpoints in the sync group. If you make a change to the cloud endpoint (Azure file share) directly, changes first need to be discovered by an Azure File Sync change detection job. A change detection job is initiated for a cloud endpoint only once every 24 hours. For more information, see [Azure Files frequently asked questions](../files/storage-files-faq.md?toc=%2fazure%2fstorage%2ffilesync%2ftoc.json#afs-change-detection). ### Consider the count of Storage Sync Services needed
-A previous section discusses the core resource to configure for Azure File Sync: a *Storage Sync Service*. A Windows Server can only be registered to one Storage Sync Service. So it is often best to only deploy a single Storage Sync Service and register all servers that it.
+A previous section discusses the core resource to configure for Azure File Sync: a *Storage Sync Service*. A Windows Server can only be registered to one Storage Sync Service. So it is often best to deploy only a single Storage Sync Service and register all servers to it.
Create multiple Storage Sync Services only if you have: * distinct sets of servers that must never exchange data with one another. In this case, you want to design the system to prevent certain sets of servers from syncing with an Azure file share that is already in use as a cloud endpoint in a sync group in a different Storage Sync Service. Another way to look at this is that Windows Servers registered to different Storage Sync Services cannot sync with the same Azure file share.
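For reference, a minimal sketch of the registration step, assuming the Az.StorageSync module and the Azure File Sync agent are installed on the server (resource names are placeholders):

```PowerShell
# Register the local Windows Server with a single Storage Sync Service.
# Placeholder names; assumes the Az.StorageSync module and sync agent are installed.
Register-AzStorageSyncServer `
    -ResourceGroupName "<resource-group>" `
    -StorageSyncServiceName "<storage-sync-service>"
```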
storage Storage Files Scale Targets https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/storage/files/storage-files-scale-targets.md
Title: Azure Files scalability and performance targets
-description: Learn about the scalability and performance targets for Azure Files, including the capacity, request rate, and inbound and outbound bandwidth limits.
+description: Learn about the capacity, IOPS, and throughput rates for Azure file shares.
Previously updated : 12/08/2021 Last updated : 01/31/2022
# Azure Files scalability and performance targets [Azure Files](storage-files-introduction.md) offers fully managed file shares in the cloud that are accessible via the SMB and NFS file system protocols. This article discusses the scalability and performance targets for Azure Files and Azure File Sync.
-The scalability and performance targets listed here are high-end targets, but may be affected by other variables in your deployment. For example, the throughput for a file may also be limited by your available network bandwidth, not just the servers hosting your Azure file shares. We strongly recommend testing your usage pattern to determine whether the scalability and performance of Azure Files meet your requirements. We are also committed to increasing these limits over time.
+The targets listed here might be affected by other variables in your deployment. For example, the IO performance for a file might be impacted by your SMB client's behavior and by your available network bandwidth. You should test your usage pattern to determine whether the scalability and performance of Azure Files meet your requirements. You should also expect these limits to increase over time.
## Applies to | File share type | SMB | NFS |
The scalability and performance targets listed here are high-end targets, but ma
Azure file shares are deployed into storage accounts, which are top-level objects that represent a shared pool of storage. This pool of storage can be used to deploy multiple file shares. There are therefore three categories to consider: storage accounts, Azure file shares, and files. ### Storage account scale targets
-Azure supports multiple types of storage accounts for different storage scenarios customers may have, but there are two main types of storage accounts for Azure Files. Which storage account type you need to create depends on whether you want to create a standard file share or a premium file share:
+There are two main types of storage accounts for Azure Files:
- **General purpose version 2 (GPv2) storage accounts**: GPv2 storage accounts allow you to deploy Azure file shares on standard/hard disk-based (HDD-based) hardware. In addition to storing Azure file shares, GPv2 storage accounts can store other storage resources such as blob containers, queues, or tables. File shares can be deployed into the transaction optimized (default), hot, or cool tiers.
Azure supports multiple types of storage accounts for different storage scenario
| Maximum storage account capacity | 5 PiB<sup>1</sup> | 100 TiB (provisioned) | | Maximum number of file shares | Unlimited | Unlimited; the total provisioned size of all shares must be less than the maximum storage account capacity | | Maximum concurrent request rate | 20,000 IOPS<sup>1</sup> | 100,000 IOPS |
-| Maximum ingress | <ul><li>US/Europe: 9,536 MiB/sec<sup>1</sup></li><li>Other regions (LRS/ZRS): 9,536 MiB/sec<sup>1</sup></li><li>Other regions (GRS): 4,768 MiB/sec<sup>1</sup></li></ul> | 4,136 MiB/sec |
-| Maximum egress | 47,683 MiB/sec<sup>1</sup> | 6,204 MiB/sec |
+| Throughput (ingress + egress) for LRS/GRS<br /><ul><li>Australia East</li><li>Central US</li><li>East Asia</li><li>East US 2</li><li>Japan East</li><li>Korea Central</li><li>North Europe</li><li>South Central US</li><li>Southeast Asia</li><li>UK South</li><li>West Europe</li><li>West US</li></ul> | <ul><li>Ingress: 7,152 MiB/sec</li><li>Egress: 14,305 MiB/sec</li></ul> | 10,340 MiB/sec |
+| Throughput (ingress + egress) for ZRS<br /><ul><li>Australia East</li><li>Central US</li><li>East US</li><li>East US 2</li><li>Japan East</li><li>North Europe</li><li>South Central US</li><li>Southeast Asia</li><li>UK South</li><li>West Europe</li><li>West US 2</li></ul> | <ul><li>Ingress: 7,152 MiB/sec</li><li>Egress: 14,305 MiB/sec</li></ul> | 10,340 MiB/sec |
+| Throughput (ingress + egress) for redundancy/region combinations not listed in the previous row | <ul><li>Ingress: 2,980 MiB/sec</li><li>Egress: 5,960 MiB/sec</li></ul> | 10,340 MiB/sec |
| Maximum number of virtual network rules | 200 | 200 | | Maximum number of IP address rules | 200 | 200 | | Management read operations | 800 per 5 minutes | 800 per 5 minutes |
Azure supports multiple types of storage accounts for different storage scenario
<sup>2 Subject to machine network limits, available bandwidth, IO sizes, queue depth, and other factors. For details see [SMB Multichannel performance](./storage-files-smb-multichannel-performance.md).</sup> ## Azure File Sync scale targets
-The following table indicates the boundaries of Microsoft's testing (soft limits) and also indicates which targets are hard limits:
+The following table indicates which targets are soft, representing the Microsoft-tested boundary, and which are hard, indicating an enforced maximum:
| Resource | Target | Hard limit | |-|--||
The following table indicates the boundaries of Microsoft's testing (soft limits
> An Azure File Sync endpoint can scale up to the size of an Azure file share. If the Azure file share size limit is reached, sync will not be able to operate. ### Azure File Sync performance metrics
-Since the Azure File Sync agent runs on a Windows Server machine that connects to the Azure file shares, the effective sync performance depends upon a number of factors in your infrastructure: Windows Server and the underlying disk configuration, network bandwidth between the server and the Azure storage, file size, total dataset size, and the activity on the dataset. Since Azure File Sync works on the file level, the performance characteristics of an Azure File Sync-based solution is better measured in the number of objects (files and directories) processed per second.
+Since the Azure File Sync agent runs on a Windows Server machine that connects to the Azure file shares, the effective sync performance depends upon a number of factors in your infrastructure: Windows Server and the underlying disk configuration, network bandwidth between the server and the Azure storage, file size, total dataset size, and the activity on the dataset. Since Azure File Sync works on the file level, the performance characteristics of an Azure File Sync-based solution should be measured by the number of objects (files and directories) processed per second.
For Azure File Sync, performance is critical in two stages:
storage Storage Troubleshoot Windows File Connection Problems https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/storage/files/storage-troubleshoot-windows-file-connection-problems.md
description: Troubleshooting Azure Files problems in Windows. See common issues
Previously updated : 09/13/2019 Last updated : 01/31/2022 # Troubleshoot Azure Files problems in Windows (SMB)
-This article lists common problems that are related to Microsoft Azure Files when you connect from Windows clients. It also provides possible causes and resolutions for these problems. In addition to the troubleshooting steps in this article, you can also use [AzFileDiagnostics](https://github.com/Azure-Samples/azure-files-samples/tree/master/AzFileDiagnostics/Windows) to ensure that the Windows client environment has correct prerequisites. AzFileDiagnostics automates detection of most of the symptoms mentioned in this article and helps set up your environment to get optimal performance.
+This article lists common problems that are related to Microsoft Azure Files when you connect from Windows clients. It also provides possible causes and resolutions for these problems. In addition to the troubleshooting steps in this article, you can also use [`AzFileDiagnostics`](https://github.com/Azure-Samples/azure-files-samples/tree/master/AzFileDiagnostics/Windows) to ensure that the Windows client environment has correct prerequisites. `AzFileDiagnostics` automates detection of most of the symptoms mentioned in this article and helps set up your environment to get optimal performance.
> [!IMPORTANT] > The content of this article only applies to SMB shares. For details on NFS shares, see [Troubleshoot Azure NFS file shares](storage-troubleshooting-files-nfs.md).
When you try to mount a file share, you might receive the following error:
### Cause 1: Unencrypted communication channel
-For security reasons, connections to Azure file shares are blocked if the communication channel isn't encrypted and if the connection attempt isn't made from the same datacenter where the Azure file shares reside. Unencrypted connections within the same datacenter can also be blocked if the [Secure transfer required](../common/storage-require-secure-transfer.md) setting is enabled on the storage account. An encrypted communication channel is provided only if the user's client OS supports SMB encryption.
+For security reasons, connections to Azure file shares are blocked if the communication channel isn't encrypted and if the connection attempt isn't made from the same datacenter where the Azure file shares reside. If the [Secure transfer required](../common/storage-require-secure-transfer.md) setting is enabled on the storage account, unencrypted connections within the same datacenter are also blocked. An encrypted communication channel is provided only if the end-user's client OS supports SMB encryption.
Windows 8, Windows Server 2012, and later versions of each system negotiate requests that include SMB 3.x, which supports encryption. ### Solution for cause 1
-1. Connect from a client that supports SMB encryption (Windows 8, Windows Server 2012 or later) or connect from a virtual machine in the same datacenter as the Azure storage account that is used for the Azure file share.
-2. Verify the [Secure transfer required](../common/storage-require-secure-transfer.md) setting is disabled on the storage account if the client does not support SMB encryption.
+1. Connect from a client that supports SMB encryption (Windows 8/Windows Server 2012 or later).
+2. Connect from a virtual machine in the same datacenter as the Azure storage account that is used for the Azure file share.
+3. Verify the [Secure transfer required](../common/storage-require-secure-transfer.md) setting is disabled on the storage account if the client does not support SMB encryption.
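For the third step, a minimal sketch of disabling the setting, assuming the Az.Storage module (placeholder names); re-enable it once all clients support SMB encryption:

```PowerShell
# Disable "Secure transfer required" so clients without SMB encryption support
# can connect from within the same datacenter. Placeholder names.
Set-AzStorageAccount `
    -ResourceGroupName "<resource-group>" `
    -Name "<storage-account>" `
    -EnableHttpsTrafficOnly $false
```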
### Cause 2: Virtual network or firewall rules are enabled on the storage account -
-If virtual network (VNET) and firewall rules are configured on the storage account, network traffic will be denied access unless the client IP address or virtual network is allowed access.
+Network traffic is denied if virtual network (VNET) and firewall rules are configured on the storage account, unless the client IP address or virtual network is on the allow list.
### Solution for cause 2
-Verify virtual network and firewall rules are configured properly on the storage account. To test if virtual network or firewall rules is causing the issue, temporarily change the setting on the storage account to **Allow access from all networks**. To learn more, see [Configure Azure Storage firewalls and virtual networks](../common/storage-network-security.md).
+Verify that virtual network and firewall rules are configured properly on the storage account. To test whether virtual network or firewall rules are causing the issue, temporarily change the setting on the storage account to **Allow access from all networks**. To learn more, see [Configure Azure Storage firewalls and virtual networks](../common/storage-network-security.md).
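A minimal sketch of that temporary test, assuming the Az.Storage module (placeholder names); set **-DefaultAction** back to `Deny` when you finish testing:

```PowerShell
# Temporarily allow access from all networks to rule out network rules.
# Placeholder names; revert -DefaultAction to Deny after the test.
Update-AzStorageAccountNetworkRuleSet `
    -ResourceGroupName "<resource-group>" `
    -Name "<storage-account>" `
    -DefaultAction Allow
```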
### Cause 3: Share-level permissions are incorrect when using identity-based authentication
-If users are accessing the Azure file share using Active Directory (AD) or Azure Active Directory Domain Services (Azure AD DS) authentication, access to the file share will fail with "Access is denied" error if share-level permissions are incorrect.
+If end-users are accessing the Azure file share using Active Directory (AD) or Azure Active Directory Domain Services (Azure AD DS) authentication, access to the file share fails with an "Access is denied" error if share-level permissions are incorrect.
### Solution for cause 3
When you try to mount a file share from on-premises or from a different datacent
System error 53 or system error 67 can occur if port 445 outbound communication to an Azure Files datacenter is blocked. To see the summary of ISPs that allow or disallow access from port 445, go to [TechNet](https://social.technet.microsoft.com/wiki/contents/articles/32346.azure-summary-of-isps-that-allow-disallow-access-from-port-445.aspx).
-To check if your firewall or ISP is blocking port 445, use the [AzFileDiagnostics](https://github.com/Azure-Samples/azure-files-samples/tree/master/AzFileDiagnostics/Windows) tool or `Test-NetConnection` cmdlet.
+To check if your firewall or ISP is blocking port 445, use the [`AzFileDiagnostics`](https://github.com/Azure-Samples/azure-files-samples/tree/master/AzFileDiagnostics/Windows) tool or `Test-NetConnection` cmdlet.
To use the `Test-NetConnection` cmdlet, the Azure PowerShell module must be installed; see [Install Azure PowerShell module](/powershell/azure/install-Az-ps) for more information. Remember to replace `<your-storage-account-name>` and `<your-resource-group-name>` with the relevant names for your storage account.
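A minimal sketch of the check (the `TcpTestSucceeded` field in the output below indicates whether port 445 is reachable):

```PowerShell
# Resolve the storage account's file endpoint and test outbound TCP 445.
# Placeholder names; assumes the Az.Storage module is installed.
$storageAccount = Get-AzStorageAccount `
    -ResourceGroupName "<your-resource-group-name>" `
    -Name "<your-storage-account-name>"

Test-NetConnection -ComputerName ([uri]$storageAccount.Context.FileEndPoint).Host -Port 445
```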
TcpTestSucceeded : True
### Solution for cause 1
-#### Solution 1 - Use Azure File Sync
+#### Solution 1 — Use Azure File Sync
Azure File Sync can transform your on-premises Windows Server into a quick cache of your Azure file share. You can use any protocol that's available on Windows Server to access your data locally, including SMB, NFS, and FTPS. Azure File Sync works over port 443 and can thus be used as a workaround to access Azure Files from clients that have port 445 blocked. [Learn how to setup Azure File Sync](../file-sync/file-sync-extend-servers.md).
-#### Solution 2 - Use VPN
+#### Solution 2 — Use VPN
By setting up a VPN to your specific storage account, the traffic goes through a secure tunnel as opposed to over the internet. Follow the [instructions to set up a VPN](storage-files-configure-p2s-vpn-windows.md) to access Azure Files from Windows.
-#### Solution 3 - Unblock port 445 with help of your ISP/IT Admin
+#### Solution 3 — Unblock port 445 with the help of your ISP/IT admin
Work with your IT department or ISP to open port 445 outbound to [Azure IP ranges](https://www.microsoft.com/download/details.aspx?id=41653).
-#### Solution 4 - Use REST API based tools like Storage Explorer/Powershell
-Azure Files also supports REST in addition to SMB. REST access works over port 443 (standard tcp). There are various tools that are written using REST API which enable rich UI experience. [Storage Explorer](../../vs-azure-tools-storage-manage-with-storage-explorer.md?tabs=windows) is one of them. [Download and Install Storage Explorer](https://azure.microsoft.com/features/storage-explorer/) and connect to your file share backed by Azure Files. You can also use [PowerShell](./storage-how-to-use-files-portal.md) which also user REST API.
+#### Solution 4 — Use REST API-based tools like Storage Explorer/PowerShell
+Azure Files also supports REST in addition to SMB. REST access works over port 443 (standard TCP). There are various tools written using the REST API that enable a rich UI experience. [Storage Explorer](../../vs-azure-tools-storage-manage-with-storage-explorer.md?tabs=windows) is one of them. [Download and Install Storage Explorer](https://azure.microsoft.com/features/storage-explorer/) and connect to your file share backed by Azure Files. You can also use [PowerShell](./storage-how-to-use-files-portal.md), which also uses the REST API.
### Cause 2: NTLMv1 is enabled
To close open handles for a file share, directory or file, use the [Close-AzStor
<a id="noaaccessfailureportal"></a> ## Error "No access" when you try to access or delete an Azure File Share
-When you try to access or delete an Azure file share in the portal, you may receive the following error:
+When you try to access or delete an Azure file share in the portal, you might receive the following error:
No access Error code: 403
Browse to the storage account where the Azure file share is located, click **Acc
## Unable to modify or delete an Azure file share (or share snapshots) because of locks or leases Azure Files provides two ways to prevent accidental modification or deletion of Azure file shares and share snapshots: -- **Storage account resource locks**: All Azure resources, including the storage account, support [resource locks](../../azure-resource-manager/management/lock-resources.md). Locks may be put on the storage account by an administrator, or by value-added services such as Azure Backup. Two variations of resource locks exist: modify, which prevents all modifications to the storage account and its resources, and delete, which only prevent deletes of the storage account and its resources. When modifying or deleting shares through the `Microsoft.Storage` resource provider, resource locks will be enforced on Azure file shares and share snapshots. Most portal operations, Azure PowerShell cmdlets for Azure Files with `Rm` in the name (i.e. `Get-AzRmStorageShare`), and Azure CLI commands in the `share-rm` command group (i.e. `az storage share-rm list`) use the `Microsoft.Storage` resource provider. Some tools and utilities such as Storage Explorer, legacy Azure Files PowerShell management cmdlets without `Rm` in the name (i.e. `Get-AzStorageShare`), and legacy Azure Files CLI commands under the `share` command group (i.e. `az storage share list`) use legacy APIs in the FileREST API that bypass the `Microsoft.Storage` resource provider and resource locks. For more information on legacy management APIs exposed in the FileREST API, see [control plane in Azure Files](/rest/api/storageservices/file-service-rest-api#control-plane).
+- **Storage account resource locks**: All Azure resources, including the storage account, support [resource locks](../../azure-resource-manager/management/lock-resources.md). Locks might be put on the storage account by an administrator, or by value-added services such as Azure Backup. Two variations of resource locks exist: modify, which prevents all modifications to the storage account and its resources, and delete, which only prevents deletes of the storage account and its resources. When modifying or deleting shares through the `Microsoft.Storage` resource provider, resource locks are enforced on Azure file shares and share snapshots. Most portal operations, Azure PowerShell cmdlets for Azure Files with `Rm` in the name (for example, `Get-AzRmStorageShare`), and Azure CLI commands in the `share-rm` command group (for example, `az storage share-rm list`) use the `Microsoft.Storage` resource provider. Some tools and utilities such as Storage Explorer, legacy Azure Files PowerShell management cmdlets without `Rm` in the name (for example, `Get-AzStorageShare`), and legacy Azure Files CLI commands under the `share` command group (for example, `az storage share list`) use legacy APIs in the FileREST API that bypass the `Microsoft.Storage` resource provider and resource locks. For more information on legacy management APIs exposed in the FileREST API, see [control plane in Azure Files](/rest/api/storageservices/file-service-rest-api#control-plane).
-- **Share/share snapshot leases**: Share leases are a kind of proprietary lock for Azure file shares and file share snapshots. Leases may be put on individual Azure file shares or file share snapshots by administrators by calling the API through a script, or by value-added services such as Azure Backup. When a lease is put on an Azure file share or file share snapshot, modifying or deleting the file share/share snapshot can be done with the *lease ID*. Users can also release the lease before modification operations, which requires the lease ID, or break the lease, which does not require the lease ID. For more information on share leases, see [lease share](/rest/api/storageservices/lease-share).
+- **Share/share snapshot leases**: Share leases are a kind of proprietary lock for Azure file shares and file share snapshots. Leases might be put on individual Azure file shares or file share snapshots by administrators by calling the API through a script, or by value-added services such as Azure Backup. When a lease is put on an Azure file share or file share snapshot, modifying or deleting the file share/share snapshot can be done with the *lease ID*. Admins can also release the lease before modification operations, which requires the lease ID, or break the lease, which does not require the lease ID. For more information on share leases, see [lease share](/rest/api/storageservices/lease-share).
-Since resource locks and leases may interfere with intended administrator operations on your storage account/Azure file shares, you may wish to remove any resource locks/leases that may have been put on your resources manually or automatically by value-added services such as Azure Backup. The following script will remove all resource locks and leases. Remember to replace `<resource-group>` and `<storage-account>` with the appropriate values for your environment.
+Since resource locks and leases might interfere with intended administrator operations on your storage account/Azure file shares, you might wish to remove any resource locks/leases that have been put on your resources manually or automatically by value-added services such as Azure Backup. The following script removes all resource locks and leases. Remember to replace `<resource-group>` and `<storage-account>` with the appropriate values for your environment.
To run the following script, you must [install the 3.10.1-preview version](https://www.powershellgallery.com/packages/Az.Storage/3.10.1-preview) of the Azure Storage PowerShell module.
When you open a file from a mounted Azure file share over SMB, your application/
Although the FileREST protocol, as a stateless protocol, does not have a concept of file handles, it does provide a similar mechanism to mediate access to files and folders that your script, application, or service may use: file leases. When a file is leased, it is treated as equivalent to a file handle with a file sharing mode of `None`.
-Although file handles and leases serve an important purpose, sometimes file handles and leases may be orphaned. When this happens, this can cause problems modifying or deleting files. You may see error messages like:
+Although file handles and leases serve an important purpose, sometimes file handles and leases might be orphaned. When this happens, it can cause problems modifying or deleting files. You might see error messages like:
- The process cannot access the file because it is being used by another process. - The action can't be completed because the file is open in another program.
To force a file handle to be closed, use the [Close-AzStorageFileHandle](/powers
> The Get-AzStorageFileHandle and Close-AzStorageFileHandle cmdlets are included in Az PowerShell module version 2.4 or later. To install the latest Az PowerShell module, see [Install the Azure PowerShell module](/powershell/azure/install-az-ps). ### Cause 2
-A file lease is prevent a file from being modified or deleted. You can check if a file has a file lease with the following PowerShell, replacing `<resource-group>`, `<storage-account>`, `<file-share>`, and `<path-to-file>` with the appropriate values for your environment:
+A file lease is preventing a file from being modified or deleted. You can check if a file has a file lease with the following PowerShell, replacing `<resource-group>`, `<storage-account>`, `<file-share>`, and `<path-to-file>` with the appropriate values for your environment:
```PowerShell # Set variables
Use one of the following solutions:
- Use the cmdkey command to add the credentials into Credential Manager. Perform this from a command line under the service account context, either through an interactive login or by using `runas`. `cmdkey /add:<storage-account-name>.file.core.windows.net /user:AZURE\<storage-account-name> /pass:<storage-account-key>`-- Map the share directly without using a mapped drive letter. Some applications may not reconnect to the drive letter properly, so using the full UNC path may be more reliable.
+- Map the share directly without using a mapped drive letter. Some applications may not reconnect to the drive letter properly, so using the full UNC path might be more reliable.
`net use * \\storage-account-name.file.core.windows.net\share`
This problem can occur if there is not enough cache on the client machine for large d
To resolve this problem, adjust the **DirectoryCacheEntrySizeMax** registry value to allow caching of larger directory listings on the client machine: -- Location: HKLM\System\CCS\Services\Lanmanworkstation\Parameters-- Value mane: DirectoryCacheEntrySizeMax -- Value type:DWORD
+- Location: `HKLM\System\CCS\Services\Lanmanworkstation\Parameters`
+- Value name: `DirectoryCacheEntrySizeMax`
+- Value type: `DWORD`
-
-For example, you can set it to 0x100000 and see if the performance become better.
+For example, you can set it to `0x100000` and see if the performance improves.
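A minimal sketch of applying that value from an elevated PowerShell session (`CCS` expands to `CurrentControlSet`); the change might require a remount or reboot to take effect:

```PowerShell
# Set DirectoryCacheEntrySizeMax to 0x100000 (1 MiB) to allow caching of
# larger directory listings. Run from an elevated session.
Set-ItemProperty `
    -Path "HKLM:\SYSTEM\CurrentControlSet\Services\LanmanWorkstation\Parameters" `
    -Name "DirectoryCacheEntrySizeMax" `
    -Value 0x100000 `
    -Type DWord
```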
## Error AadDsTenantNotFound in enabling Azure Active Directory Domain Service (Azure AD DS) authentication for Azure Files "Unable to locate active tenants with tenant ID aad-tenant-id"
Enable Azure AD DS on the Azure AD tenant of the subscription that your storage
### Self diagnostics steps First, make sure that you have followed through all four steps to [enable Azure Files AD Authentication](./storage-files-identity-auth-active-directory-enable.md).
-Second, try [mounting Azure file share with storage account key](./storage-how-to-use-files-windows.md). If you failed to mount, download [AzFileDiagnostics.ps1](https://github.com/Azure-Samples/azure-files-samples/tree/master/AzFileDiagnostics/Windows) to help you validate the client running environment, detect the incompatible client configuration which would cause access failure for Azure Files, gives prescriptive guidance on self-fix and, collect the diagnostics traces.
+Second, try [mounting Azure file share with storage account key](./storage-how-to-use-files-windows.md). If the mount fails, download [`AzFileDiagnostics`](https://github.com/Azure-Samples/azure-files-samples/tree/master/AzFileDiagnostics/Windows) to help you validate the client environment, detect incompatible client configurations that would cause access failures for Azure Files, get prescriptive guidance on self-fixes, and collect diagnostics traces.
Third, you can run the Debug-AzStorageAccountAuth cmdlet to conduct a set of basic checks on your AD configuration with the logged on AD user. This cmdlet is supported on [AzFilesHybrid v0.1.2+ version](https://github.com/Azure-Samples/azure-files-samples/releases). You need to run this cmdlet with an AD user that has owner permission on the target storage account. ```PowerShell
The cmdlet performs these checks below in sequence and provides guidance for fai
1. CheckADObjectPasswordIsCorrect: Ensure that the password configured on the AD identity that represents the storage account matches that of the storage account kerb1 or kerb2 key. If the password is incorrect, you can run [Update-AzStorageAccountADObjectPassword](./storage-files-identity-ad-ds-update-password.md) to reset the password. 2. CheckADObject: Confirm that there is an object in the Active Directory that represents the storage account and has the correct SPN (service principal name). If the SPN isn't set up correctly, please run the Set-AD cmdlet returned in the debug cmdlet to configure the SPN. 3. CheckDomainJoined: Validate that the client machine is domain joined to AD. If your machine is not domain joined to AD, please refer to this [article](/windows-server/identity/ad-fs/deployment/join-a-computer-to-a-domain) for domain join instructions.
-4. CheckPort445Connectivity: Check that Port 445 is opened for SMB connection. If the required Port is not open, please refer to the troubleshooting tool [AzFileDiagnostics.ps1](https://github.com/Azure-Samples/azure-files-samples/tree/master/AzFileDiagnostics/Windows) for connectivity issues with Azure Files.
+4. CheckPort445Connectivity: Check that Port 445 is opened for SMB connection. If the required Port is not open, please refer to the troubleshooting tool [`AzFileDiagnostics`](https://github.com/Azure-Samples/azure-files-samples/tree/master/AzFileDiagnostics/Windows) for connectivity issues with Azure Files.
5. CheckSidHasAadUser: Check that the logged-on AD user is synced to Azure AD. If you want to look up whether a specific AD user is synchronized to Azure AD, you can specify the -UserName and -Domain in the input parameters. 6. CheckGetKerberosTicket: Attempt to get a Kerberos ticket to connect to the storage account. If there isn't a valid Kerberos token, run the klist get cifs/storage-account-name.file.core.windows.net command and examine the error code to root-cause the ticket retrieval failure. 7. CheckStorageAccountDomainJoined: Check if the AD authentication has been enabled and the account's AD properties are populated. If not, refer to the instructions [here](./storage-files-identity-ad-ds-enable.md) to enable AD DS authentication on Azure Files.
-8. CheckUserRbacAssignment: Check if the AD user has the proper RBAC role assignment to provide share level permission to access Azure Files. If not, refer to the instruction [here](./storage-files-identity-ad-ds-assign-permissions.md) to configure the share level permission. (Supported on AzFilesHybrid v0.2.3+ version)
-9. CheckUserFileAccess: Check if the AD user has the proper directory/file permission (Windows ACLs) to access Azure Files. If not, refer to the instruction [here](./storage-files-identity-ad-ds-configure-permissions.md) to configure the directory/file level permission. (Supported on AzFilesHybrid v0.2.3+ version)
+8. CheckUserRbacAssignment: Check if the AD identity has the proper RBAC role assignment to provide share level permission to access Azure Files. If not, refer to the instruction [here](./storage-files-identity-ad-ds-assign-permissions.md) to configure the share level permission. (Supported on AzFilesHybrid v0.2.3+ version)
+9. CheckUserFileAccess: Check if the AD identity has the proper directory/file permission (Windows ACLs) to access Azure Files. If not, refer to the instruction [here](./storage-files-identity-ad-ds-configure-permissions.md) to configure the directory/file level permission. (Supported on AzFilesHybrid v0.2.3+ version)
## Unable to configure directory/file level permissions (Windows ACLs) with Windows File Explorer
This error is most likely triggered by a syntax error in the Join-AzStorageAccou
## Azure Files on-premises AD DS Authentication support for AES 256 Kerberos encryption
-We introduced AES 256 Kerberos encryption support for Azure Files on-prem AD DS authentication with [AzFilesHybrid module v0.2.2](https://github.com/Azure-Samples/azure-files-samples/releases). If you have enabled AD DS authentication with a module version lower than v0.2.2, you will need to download the latest AzFilesHybrid module (v0.2.2+) and run the PowerShell below. If you have not enabled AD DS authentication on your storage account yet, you can follow this [guidance](./storage-files-identity-ad-ds-enable.md#option-one-recommended-use-azfileshybrid-powershell-module) for enablement.
+Azure Files supports AES 256 Kerberos encryption for AD DS authentication with the [AzFilesHybrid module v0.2.2](https://github.com/Azure-Samples/azure-files-samples/releases) and later. If you have enabled AD DS authentication with a module version lower than v0.2.2, you will need to download the latest AzFilesHybrid module (v0.2.2+) and run the PowerShell below. If you have not enabled AD DS authentication on your storage account yet, you can follow this [guidance](./storage-files-identity-ad-ds-enable.md#option-one-recommended-use-azfileshybrid-powershell-module) for enablement.
```PowerShell $ResourceGroupName = "<resource-group-name-here>"
$StorageAccountName = "<storage-account-name-here>"
Update-AzStorageAccountAuthForAES256 -ResourceGroupName $ResourceGroupName -StorageAccountName $StorageAccountName ```
+## User identity formerly having the Owner or Contributor role assignment still has storage account key access
+The storage account Owner and Contributor roles grant the ability to list the storage account keys. The storage account key enables full access to the storage account's data, including file shares, blob containers, tables, and queues, and limited access to the Azure Files management operations via the legacy management APIs exposed through the FileREST API. If you're changing role assignments, consider that users removed from the Owner or Contributor roles might retain access to the storage account through saved storage account keys.
+
+### Solution 1
+You can remedy this issue easily by rotating the storage account keys. We recommend rotating the keys one at a time, switching access from one to the other as they are rotated. There are two types of shared keys the storage account provides: the storage account keys, which provide super-administrator access to the storage account's data, and the Kerberos keys, which function as a shared secret between the storage account and the Windows Server Active Directory domain controller for Windows Server Active Directory scenarios.
+
+To rotate the Kerberos keys of a storage account, see [Update the password of your storage account identity in AD DS](./storage-files-identity-ad-ds-update-password.md).
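A minimal sketch of rotating one Kerberos key, assuming the Az.Storage module (placeholder names); rotate `kerb2` the same way:

```PowerShell
# Rotate the kerb1 Kerberos key of the storage account. Placeholder names;
# the kerb keys are separate from the key1/key2 storage account keys.
New-AzStorageAccountKey `
    -ResourceGroupName "<resource-group>" `
    -Name "<storage-account>" `
    -KeyName "kerb1"
```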
+
+# [Portal](#tab/azure-portal)
+Navigate to the desired storage account in the Azure portal. In the table of contents for the storage account, select **Access keys** under the **Security + networking** heading. In the **Access keys** pane, select **Rotate key** above the desired key.
+
+![A screenshot of the access key pane](./media/storage-troubleshoot-windows-file-connection-problems/access-keys-1.png)
+
+# [PowerShell](#tab/azure-powershell)
+The following script will rotate both keys for the storage account. If you want to swap out keys during rotation, you will need to provide additional logic in your script to handle this scenario. Remember to replace `<resource-group>` and `<storage-account>` with the appropriate values for your environment.
+
+```PowerShell
+$resourceGroupName = "<resource-group>"
+$storageAccountName = "<storage-account>"
+
+# Rotate primary key (key 1). You should switch to key 2 before rotating key 1.
+New-AzStorageAccountKey `
+ -ResourceGroupName $resourceGroupName `
+ -Name $storageAccountName `
+ -KeyName "key1"
+
+# Rotate secondary key (key 2). You should switch to the new key 1 before rotating key 2.
+New-AzStorageAccountKey `
+ -ResourceGroupName $resourceGroupName `
+ -Name $storageAccountName `
+ -KeyName "key2"
+```
+
+# [Azure CLI](#tab/azure-cli)
+The following script will rotate both keys for the storage account. If you want to swap out keys during rotation, you will need to provide additional logic in your script to handle this scenario. Remember to replace `<resource-group>` and `<storage-account>` with the appropriate values for your environment.
+
+```bash
+resourceGroupName="<resource-group>"
+storageAccountName="<storage-account>"
+
+# Rotate primary key (key 1). You should switch to key 2 before rotating key 1.
+az storage account keys renew \
+ --resource-group $resourceGroupName \
+ --account-name $storageAccountName \
+ --key "primary"
+
+# Rotate secondary key (key 2). You should switch to the new key 1 before rotating key 2.
+az storage account keys renew \
+ --resource-group $resourceGroupName \
+ --account-name $storageAccountName \
+ --key "secondary"
+```
+++ ## Need help? Contact support. If you still need help, [contact support](https://portal.azure.com/?#blade/Microsoft_Azure_Support/HelpAndSupportBlade) to get your problem resolved quickly.
virtual-machines Custom Script Linux https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/virtual-machines/extensions/custom-script-linux.md
az vm extension set \
If you deploy the Custom Script Extension from the Azure portal, you don't have control over the expiration of the SAS token for accessing the script in your storage account. The result is that the initial deployment works, but when the storage account's SAS token expires, any subsequent scaling operation fails because the Custom Script Extension can no longer access the storage account.
-We recommend that you use [PowerShell](/powershell/module/az.Compute/Add-azVmssExtension?view=azps-7.0.0), the [Azure CLI](/cli/azure/vmss/extension?view=azure-cli-latest), or an Azure Resource Manager template when you deploy the Custom Script Extension on a virtual machine scale set. This way, you can choose to use a managed identity or have direct control of the expiration of the SAS token for accessing the script in your storage account for as long as you need.
+We recommend that you use [PowerShell](/powershell/module/az.Compute/Add-azVmssExtension?view=azps-7.0.0), the [Azure CLI](/cli/azure/vmss/extension), or an Azure Resource Manager template when you deploy the Custom Script Extension on a virtual machine scale set. This way, you can choose to use a managed identity or have direct control of the expiration of the SAS token for accessing the script in your storage account for as long as you need.
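As an illustration only (the scale set object, names, and script URL below are placeholders, not from this article), the managed-identity route might look like the following; an empty `managedIdentity` object tells the extension to fetch the script with the system-assigned identity instead of a SAS token:

```PowerShell
# A sketch: add the Custom Script Extension to a scale set using the
# system-assigned managed identity rather than a SAS token.
$settings = @{ fileUris = @("https://<storage-account>.blob.core.windows.net/scripts/config.sh") }
$protected = @{ commandToExecute = "bash config.sh"; managedIdentity = @{} }

$vmss = Add-AzVmssExtension `
    -VirtualMachineScaleSet $vmss `
    -Name "CustomScript" `
    -Publisher "Microsoft.Azure.Extensions" `
    -Type "CustomScript" `
    -TypeHandlerVersion "2.1" `
    -Setting $settings `
    -ProtectedSetting $protected

Update-AzVmss `
    -ResourceGroupName "<resource-group>" `
    -VMScaleSetName "<vmss-name>" `
    -VirtualMachineScaleSet $vmss
```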
## Troubleshooting When the Custom Script Extension runs, the script is created or downloaded into a directory that's similar to the following example. The command output is also saved into this directory in `stdout` and `stderr` files.
virtual-machines Custom Script Windows https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/virtual-machines/extensions/custom-script-windows.md
The response content cannot be parsed because the Internet Explorer engine is no
If you deploy the Custom Script Extension from the Azure portal, you don't have control over the expiration of the SAS token for accessing the script in your storage account. The result is that the initial deployment works, but when the storage account's SAS token expires, any subsequent scaling operation fails because the Custom Script Extension can no longer access the storage account.
-We recommend that you use [PowerShell](/powershell/module/az.Compute/Add-azVmssExtension?view=azps-7.0.0), the [Azure CLI](/cli/azure/vmss/extension?view=azure-cli-latest), or an Azure Resource Manager template when you deploy the Custom Script Extension on a virtual machine scale set. This way, you can choose to use a managed identity or have direct control of the expiration of the SAS token for accessing the script in your storage account for as long as you need.
+We recommend that you use [PowerShell](/powershell/module/az.Compute/Add-azVmssExtension?view=azps-7.0.0), the [Azure CLI](/cli/azure/vmss/extension), or an Azure Resource Manager template when you deploy the Custom Script Extension on a virtual machine scale set. This way, you can choose to use a managed identity or have direct control of the expiration of the SAS token for accessing the script in your storage account for as long as you need.
## Classic VMs
virtual-machines Key Vault Windows https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/virtual-machines/extensions/key-vault-windows.md
The Key Vault VM extension supports below versions of Windows:
The Key Vault VM extension is also supported on a custom local VM that is uploaded and converted into a specialized image for use in Azure, using a Windows Server 2019 core install.
+> [!NOTE]
+> The Key Vault VM extension downloads all the certificates to the Windows certificate store or to the location provided by the "certificateStoreLocation" property in the VM extension settings. Currently, the Key Vault VM extension grants access to the private key of the certificate only to the local system admin account. Additionally, it is currently not possible to define a certificate store location per certificate. The VM extension team is working on a solution to close this feature gap.
++ ### Supported certificate content types - PKCS #12
virtual-machines Tutorial Build Deploy Azure Pipelines https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/virtual-machines/linux/tutorial-build-deploy-azure-pipelines.md
For more guidance, follow the steps in [Build your Node.js app with gulp](/azure
- deployment: VMDeploy displayName: web pool:
- vmImage: 'Ubuntu-16.04'
+ vmImage: 'Ubuntu-latest'
environment: name: <environment name> resourceType: VirtualMachine
virtual-machines Managed Disks Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/virtual-machines/managed-disks-overview.md
description: Overview of Azure managed disks, which handle the storage accounts
Previously updated : 11/02/2021 Last updated : 02/01/2022
A data disk is a managed disk that's attached to a virtual machine to store appl
Every virtual machine has one attached operating system disk. That OS disk has a pre-installed OS, which was selected when the VM was created. This disk contains the boot volume.
-This disk has a maximum capacity of 4,095 GiB.
+This disk has a maximum capacity of 4,095 GiB. However, most deployments use [master boot record (MBR)](https://wikipedia.org/wiki/Master_boot_record) by default, which limits the usable size to 2 TiB. If you need more than 2 TiB, create and attach [data disks](#data-disk) and use them for data storage. If you need to store data on the OS disk and require the additional space, [convert it to GUID Partition Table](/windows-server/storage/disk-management/change-an-mbr-disk-into-a-gpt-disk) (GPT). To learn about the differences between MBR and GPT on Windows deployments, see [Windows and GPT FAQ](/windows-hardware/manufacture/desktop/windows-and-gpt-faq?view=windows-11).
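For example, a minimal sketch of initializing a newly attached data disk with GPT from inside a Windows VM (the disk number is an assumption; confirm it with `Get-Disk` first):

```PowerShell
# Initialize a new data disk with GPT so volumes larger than 2 TiB are usable.
# Disk number 2 is an assumption; verify with Get-Disk before running.
Get-Disk -Number 2 |
    Initialize-Disk -PartitionStyle GPT -PassThru |
    New-Partition -UseMaximumSize -AssignDriveLetter |
    Format-Volume -FileSystem NTFS -NewFileSystemLabel "Data"
```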
### Temporary disk
virtual-machines Disks Upload Vhd To Managed Disk Powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/virtual-machines/windows/disks-upload-vhd-to-managed-disk-powershell.md
Title: Upload a VHD to Azure or copy a disk across regions - Azure PowerShell
description: Learn how to upload a VHD to an Azure managed disk and copy a managed disk across regions, using Azure PowerShell, via direct upload. Previously updated : 09/07/2021 Last updated : 02/01/2022 linux
**Applies to:** :heavy_check_mark: Windows VMs
+This article explains how to either upload a VHD from your local machine to an Azure managed disk or copy a managed disk to another region, using the Azure PowerShell module. The process of uploading a managed disk, also known as direct upload, enables you to upload a VHD up to 32 TiB in size directly into a managed disk. Currently, direct upload is supported for standard HDDs, standard SSDs, and premium SSDs. It isn't supported for ultra disks yet.
-## Prerequisites
--- Download the latest [version of AzCopy v10](../../storage/common/storage-use-azcopy-v10.md#download-and-install-azcopy).-- [Install Azure PowerShell module](/powershell/azure/install-Az-ps).-- If you intend to upload a VHD from on-premises: A fixed size VHD that [has been prepared for Azure](prepare-for-upload-vhd-image.md), stored locally.-- Or, a managed disk in Azure, if you intend to perform a copy action.
+If you're providing a backup solution for IaaS VMs in Azure, you should use direct upload to restore customer backups to managed disks. When uploading a VHD from a source external to Azure, speeds depend on your local bandwidth. When uploading or copying from an Azure VM, your bandwidth would be the same as standard HDDs.
## Getting started
-If you'd prefer to upload disks through a GUI, you can do so using Azure Storage Explorer. For details refer to: [Use Azure Storage Explorer to manage Azure managed disks](../disks-use-storage-explorer-managed-disks.md)
+There are two ways you can upload a VHD with the Azure PowerShell module: You can either use the [Add-AzVHD](/powershell/module/az.compute/add-azvhd?view=azps-7.1.0&viewFallbackFrom=azps-5.4.0) command, which will automate most of the process for you, or you can perform the upload manually with AzCopy.
+
+Generally, you should use [Add-AzVHD](#use-add-azvhd). However, if you need to upload a VHD that is larger than 50 GiB, consider [uploading the VHD manually with AzCopy](#manual-upload). VHDs 50 GiB and larger will upload faster using AzCopy.
+
+For guidance on how to copy a managed disk from one region to another, see [Copy a managed disk](#copy-a-managed-disk).
+
+## Use Add-AzVHD
+
+### Prerequisites
+
+- [Install the Azure PowerShell module](/powershell/azure/install-Az-ps).
+- A VHD [has been prepared for Azure](prepare-for-upload-vhd-image.md), stored locally.
+ - On Windows: You don't need to convert your VHD to VHDx, convert it to a fixed size, or resize it to include the 512-byte offset. `Add-AzVHD` performs these functions for you.
+ - [Hyper-V](/windows-server/virtualization/hyper-v/hyper-v-technology-overview) must be enabled for Add-AzVHD to perform these functions.
+ - On Linux: You must perform these actions manually. See [Resizing VHDs](/azure/virtual-machines/linux/create-upload-generic) for details.
+
+### Upload a VHD
+
+The following example uploads a VHD from your local machine to a new Azure managed disk using [Add-AzVHD](/powershell/module/az.compute/add-azvhd?view=azps-7.1.0&viewFallbackFrom=azps-5.4.0). Replace `<your-filepath-here>`, `<your-resource-group-name>`,`<desired-region>`, and `<desired-managed-disk-name>` with your parameters:
+
+```azurepowershell
+# Required parameters
+$path = "<your-filepath-here>.vhd"
+$resourceGroup = "<your-resource-group-name>"
+$location = "<desired-region>"
+$name = "<desired-managed-disk-name>"
+
+# Optional parameters
+# $Zone = "<desired-zone>"
+# $sku = "<desired-SKU>"
+
+# To use $Zone or $sku, add the -Zone or -DiskSKU parameters to the command
+Add-AzVhd -LocalFilePath $path -ResourceGroupName $resourceGroup -Location $location -DiskName $name
+```
+
+## Manual upload
+
+### Prerequisites
+
+- Download the latest [version of AzCopy v10](../../storage/common/storage-use-azcopy-v10.md#download-and-install-azcopy).
+- [Install the Azure PowerShell module](/powershell/azure/install-Az-ps).
+- A fixed size VHD that [has been prepared for Azure](prepare-for-upload-vhd-image.md), stored locally.
To upload your VHD to Azure, you'll need to create an empty managed disk that is configured for this upload process. Before you create one, there's some additional information you should know about these disks.
This kind of managed disk has two unique states:
> [!NOTE] > While in either of these states, the managed disk will be billed at [standard HDD pricing](https://azure.microsoft.com/pricing/details/managed-disks/), regardless of the actual type of disk. For example, a P10 will be billed as an S10. This will be true until `revoke-access` is called on the managed disk, which is required in order to attach the disk to a VM.
-## Create an empty managed disk
+### Create an empty managed disk
Before you can create an empty standard HDD for uploading, you'll need the file size of the VHD you want to upload, in bytes. The example code will get that for you, but to do it yourself you can use: `$vhdSizeBytes = (Get-Item "<fullFilePathHere>").length`. This value is used when specifying the **-UploadSizeInBytes** parameter.
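A minimal sketch of creating that empty, upload-ready disk (placeholder names, consistent with the steps described here):

```PowerShell
# Create an empty managed disk in the upload-ready state. The size comes
# from the local VHD; all names are placeholders.
$vhdSizeBytes = (Get-Item "<fullFilePathHere>").length

$diskConfig = New-AzDiskConfig `
    -SkuName "Standard_LRS" `
    -OsType "Windows" `
    -UploadSizeInBytes $vhdSizeBytes `
    -Location "<yourLocation>" `
    -CreateOption "Upload"

New-AzDisk `
    -ResourceGroupName "<yourresourcegroupname>" `
    -DiskName "<yourdiskname>" `
    -Disk $diskConfig
```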
$diskSas = Grant-AzDiskAccess -ResourceGroupName '<yourresourcegroupname>' -Disk
$disk = Get-AzDisk -ResourceGroupName '<yourresourcegroupname>' -DiskName '<yourdiskname>' ```
-## Upload a VHD
+### Upload a VHD
Now that you have a SAS for your empty managed disk, you can use it to set your managed disk as the destination for your upload command.
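A minimal sketch of that upload step with AzCopy v10, run from PowerShell (`$diskSas` comes from `Grant-AzDiskAccess` as shown earlier; the local path is a placeholder):

```PowerShell
# Upload the fixed-size VHD into the empty managed disk as a page blob.
azcopy copy "<your-local-vhd-path>.vhd" $diskSas.AccessSAS --blob-type PageBlob
```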
Revoke-AzDiskAccess -ResourceGroupName '<yourresourcegroupname>' -DiskName '<you
Direct upload also simplifies the process of copying a managed disk. You can either copy within the same region or copy your managed disk to another region.
-The follow script will do this for you, the process is similar to the steps described earlier, with some differences, since you're working with an existing disk.
+The following script does this for you. The process is similar to the steps described earlier, with some differences because you're working with an existing disk.
> [!IMPORTANT]
-> You need to add an offset of 512 when you're providing the disk size in bytes of a managed disk from Azure. This is because Azure omits the footer when returning the disk size. The copy will fail if you do not do this. The following script already does this for you.
+> You must add an offset of 512 when you're providing the disk size in bytes of a managed disk from Azure. This is because Azure omits the footer when returning the disk size. The copy will fail if you don't do this. The following script already does this for you.
Replace the `<sourceResourceGroupHere>`, `<sourceDiskNameHere>`, `<targetDiskNameHere>`, `<targetResourceGroupHere>`, `<yourOSTypeHere>` and `<yourTargetLocationHere>` (an example of a location value would be westus2) with your values, then run the following script in order to copy a managed disk.
virtual-machines Prepare For Upload Vhd Image https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/virtual-machines/windows/prepare-for-upload-vhd-image.md
generalized disk, see
## Convert the virtual disk to a fixed size VHD
+> [!NOTE]
+> If you're going to use Azure PowerShell to [upload your disk to Azure](disks-upload-vhd-to-managed-disk-powershell.md) and you have [Hyper-V](/windows-server/virtualization/hyper-v/hyper-v-technology-overview) enabled, this step is optional. [Add-AzVHD](/powershell/module/az.compute/add-azvhd?view=azps-7.1.0&viewFallbackFrom=azps-5.4.0) will perform it for you.
+ Use one of the methods in this section to convert and resize your virtual disk to the required format for Azure: 1. Back up the VM before you run the virtual disk conversion or resize process.
Use one of the methods in this section to convert and resize your virtual disk t
You can convert a virtual disk using the [Convert-VHD](/powershell/module/hyper-v/convert-vhd) cmdlet in PowerShell. If you need information about installing this cmdlet see [Install the Hyper-V role](/windows-server/virtualization/hyper-v/get-started/install-the-hyper-v-role-on-windows-server).
+> [!NOTE]
+> If you're going to use Azure PowerShell to [upload your disk to Azure](disks-upload-vhd-to-managed-disk-powershell.md) and you have [Hyper-V](/windows-server/virtualization/hyper-v/hyper-v-technology-overview) enabled, this step is optional. [Add-AzVHD](/powershell/module/az.compute/add-azvhd?view=azps-7.1.0&viewFallbackFrom=azps-5.4.0) will perform it for you.
+ The following example converts the disk from VHDX to VHD. It also converts the disk from a dynamically expanding disk to a fixed-size disk.
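A sketch of that conversion (paths are placeholders):

```PowerShell
# Convert a dynamically expanding VHDX into the fixed-size VHD Azure expects.
# Paths are placeholders.
Convert-VHD `
    -Path "C:\source\MyVM.vhdx" `
    -DestinationPath "C:\output\MyVM.vhd" `
    -VHDType Fixed
```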
disk.
### Use Hyper-V Manager to resize the disk
+> [!NOTE]
+> If you're going to use Azure PowerShell to [upload your disk to Azure](disks-upload-vhd-to-managed-disk-powershell.md) and you have [Hyper-V](/windows-server/virtualization/hyper-v/hyper-v-technology-overview) enabled, this step is optional. [Add-AzVHD](/powershell/module/az.compute/add-azvhd?view=azps-7.1.0&viewFallbackFrom=azps-5.4.0) will perform it for you.
+ 1. Open Hyper-V Manager and select your local computer on the left. In the menu above the computer list, select **Action** > **Edit Disk**. 1. On the **Locate Virtual Hard Disk** page, select your virtual disk.
disk.
### Use PowerShell to resize the disk
+> [!NOTE]
+> If you're going to use Azure PowerShell to [upload your disk to Azure](disks-upload-vhd-to-managed-disk-powershell.md) and you have [Hyper-V](/windows-server/virtualization/hyper-v/hyper-v-technology-overview) enabled, this step is optional. [Add-AzVHD](/powershell/module/az.compute/add-azvhd?view=azps-7.1.0&viewFallbackFrom=azps-5.4.0) will perform it for you.
+ You can resize a virtual disk using the [Resize-VHD](/powershell/module/hyper-v/resize-vhd) cmdlet in PowerShell. If you need information about installing this cmdlet see [Install the Hyper-V role](/windows-server/virtualization/hyper-v/get-started/install-the-hyper-v-role-on-windows-server).
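A minimal sketch of the resize (path and size are placeholders; Azure requires the virtual size to be a whole number of MiB):

```PowerShell
# Resize the VHD so its virtual size is an even multiple of 1 MiB.
Resize-VHD -Path "C:\output\MyVM.vhd" -SizeBytes 40GB
```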
virtual-machines Jboss Eap Marketplace Image https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/virtual-machines/workloads/redhat/jboss-eap-marketplace-image.md
Title: Azure Marketplace offer for Red Hat JBoss EAP on Azure Red Hat Enterprise Linux Virtual Machine (VM) and virtual machine scale sets description: How to deploy Red Hat JBoss EAP on Azure Red Hat Enterprise Linux (RHEL) VM and virtual machine scale sets using Azure Marketplace offer.--+++
virtual-machines Jboss Eap On Azure Best Practices https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/virtual-machines/workloads/redhat/jboss-eap-on-azure-best-practices.md
Title: Red Hat JBoss EAP on Azure Best Practices description: The guide provides information on the best practices for using Red Hat JBoss Enterprise Application Platform in Microsoft Azure.--+++
virtual-machines Jboss Eap On Azure Migration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/virtual-machines/workloads/redhat/jboss-eap-on-azure-migration.md
Title: JBoss EAP to Azure virtual machines virtual machine scale sets migration guide description: This guide provides information on how to migrate your enterprise Java applications from another application server to JBoss EAP and from traditional on-premises server to Azure RHEL VM and virtual machine scale sets.--+++
For any support related questions, issues or customization requirements, please
* [Configuring a Java app for Azure App Service](../../../app-service/configure-language-java.md) * [How to deploy JBoss EAP onto Azure App Service](https://github.com/JasonFreeberg/jboss-on-app-service) tutorial * [Use Azure App Service Migration Assistance](https://azure.microsoft.com/services/app-service/migration-assistant/)
-* [Use Red Hat Migration Toolkit for Applications](https://developers.redhat.com/products/mta)
+* [Use Red Hat Migration Toolkit for Applications](https://developers.redhat.com/products/mta)
virtual-machines Jboss Eap On Rhel https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/virtual-machines/workloads/redhat/jboss-eap-on-rhel.md
Title: Quickstart - Deploy JBoss Enterprise Application Platform (EAP) on Red Hat Enterprise Linux (RHEL) to Azure VMs and virtual machine scale sets description: How to deploy enterprise Java applications by using Red Hat JBoss EAP on Azure RHEL VMs and virtual machine scale sets.--+++ Last updated 10/30/2020 ms.assetid: 8a4df7bf-be49-4198-800e-db381cda98f5 -
vpn-gateway Packet Capture https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/vpn-gateway/packet-capture.md
Previously updated : 02/22/2021 Last updated : 01/31/2022
The following examples of JSON and a JSON schema provide explanations of each pr
} ```
-## Packet capture - portal
+## Start packet capture - portal
-You can set up packet capture in the Azure portal.
+You can set up packet capture in the Azure portal. Navigate to the VPN Gateway Packet Capture blade and select the **Start Packet Capture** button.
+
+## Stop packet capture - portal
+
+A valid SAS (Shared Access Signature) URI with read/write access is required to complete a packet capture. When a packet capture is stopped, its output is written to the container that is referenced by the SAS URI. To get the SAS URI, navigate to the required storage account and generate a SAS token and URL with the correct permissions; a scripted way to generate the SAS URL is sketched after the steps below.
++
+* Copy the Blob SAS URL as it will be needed in the next step.
+
+* Navigate to the VPN Gateway Packet Capture blade in the Azure portal and select the **Stop Packet Capture** button.
+
+* Paste the SAS URL (from the previous step) in the **Output Sas Uri** text box and click **Stop Packet Capture**.
++
+* The packet capture (pcap) file will be stored in the specified storage account.
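As referenced above, a minimal sketch of generating the SAS URL with the Az.Storage module (container and account names are placeholders):

```PowerShell
# Generate a container SAS URL with read/write/delete permissions for the
# packet capture output. Placeholder names; assumes the Az.Storage module.
$ctx = (Get-AzStorageAccount `
    -ResourceGroupName "<resource-group>" `
    -Name "<storage-account>").Context

New-AzStorageContainerSASToken `
    -Name "<container-name>" `
    -Permission "rwd" `
    -ExpiryTime (Get-Date).AddHours(2) `
    -Context $ctx `
    -FullUri
```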
## Packet capture - PowerShell
vpn-gateway Troubleshoot Vpn With Azure Diagnostics https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/vpn-gateway/troubleshoot-vpn-with-azure-diagnostics.md
The official document
## <a name="P2SDiagnosticLog"></a>P2SDiagnosticLog
-The last available table for VPN diagnostics is **P2SDiagnosticLog**. This table traces the activity for Point to Site.
+The last available table for VPN diagnostics is **P2SDiagnosticLog**. This table traces the activity for Point to Site (only IKEv2 and OpenVPN protocols).
Here you have a sample query as reference.
Also, whenever a client will connect via IKEv2 or OpenVPN Point to Site, the tab
## Next Steps
-To configure alerts on tunnel resource logs, see [Set up alerts on VPN Gateway resource logs](vpn-gateway-howto-setup-alerts-virtual-network-gateway-log.md).
+To configure alerts on tunnel resource logs, see [Set up alerts on VPN Gateway resource logs](vpn-gateway-howto-setup-alerts-virtual-network-gateway-log.md).