Updates from: 02/02/2022 06:50:11
Service Microsoft Docs article Related commit history on GitHub Change details
active-directory-b2c Whats New Docs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/whats-new-docs.md
Welcome to what's new in Azure Active Directory B2C documentation. This article lists new docs that have been added and those that have had significant updates in the last three months. To learn what's new with the B2C service, see [What's new in Azure Active Directory](../active-directory/fundamentals/whats-new.md).
+## January 2022
+
+### Updated articles
+
+- [Tutorial: Secure Hybrid Access to applications with Azure AD B2C and F5 BIG-IP](partner-f5.md)
+- [Set up a force password reset flow in Azure Active Directory B2C](force-password-reset.md)
+- [Boolean claims transformations](boolean-transformations.md)
+- [Date claims transformations](date-transformations.md)
+- [General claims transformations](general-transformations.md)
+- [Integer claims transformations](integer-transformations.md)
+- [JSON claims transformations](json-transformations.md)
+- [Define phone number claims transformations in Azure AD B2C](phone-number-claims-transformations.md)
+- [Social accounts claims transformations](social-transformations.md)
+- [String claims transformations](string-transformations.md)
+- [StringCollection claims transformations](stringcollection-transformations.md)
+- [Billing model for Azure Active Directory B2C](billing.md)
+- [Configure SAML identity provider options with Azure Active Directory B2C](identity-provider-generic-saml-options.md)
+- [About claim resolvers in Azure Active Directory B2C custom policies](claim-resolver-overview.md)
+- [Add AD FS as a SAML identity provider using custom policies in Azure Active Directory B2C](identity-provider-adfs-saml.md)
+
## December 2021

### New articles
active-directory-domain-services Manage Group Policy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-domain-services/manage-group-policy.md
This article shows you how to install the Group Policy Management tools, then ed
If you are interested in server management strategy, including machines in Azure and [hybrid connected](../azure-arc/servers/overview.md),
-consider reading how to
-[convert Group Policy content](../governance/policy/how-to/guest-configuration-create-group-policy.md)
-to the
+consider reading about the
[guest configuration](../governance/policy/concepts/guest-configuration.md) feature of [Azure Policy](../governance/policy/overview.md).
active-directory Concept Authentication Passwordless https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/authentication/concept-authentication-passwordless.md
The following providers offer FIDO2 security keys of different form factors that
| IDmelon Technologies Inc. | ![y] | ![y]| ![y]| ![y]| ![n] | https://www.idmelon.com/#idmelon |
| Kensington | ![y] | ![y]| ![n]| ![n]| ![n] | https://www.kensington.com/solutions/product-category/why-biometrics/ |
| KONA I | ![y] | ![n]| ![y]| ![y]| ![n] | https://konai.com/business/security/fido |
-| NEOWAVE | ![n] | ![y]| ![y]| ![n]| ![n] | https://neowave.fr/en/products/fido-range/ |
+| NeoWave | ![n] | ![y]| ![y]| ![n]| ![n] | https://neowave.fr/en/products/fido-range/ |
| Nymi | ![y] | ![n]| ![y]| ![n]| ![n] | https://www.nymi.com/nymi-band |
+| Octatco | ![y] | ![y]| ![n]| ![n]| ![n] | https://octatco.com/ |
| OneSpan Inc. | ![n] | ![y]| ![n]| ![y]| ![n] | https://www.onespan.com/products/fido |
| Thales Group | ![n] | ![y]| ![y]| ![n]| ![n] | https://cpl.thalesgroup.com/access-management/authenticators/fido-devices |
| Thetis | ![y] | ![y]| ![y]| ![y]| ![n] | https://thetis.io/collections/fido2 |
active-directory V2 Oauth2 Client Creds Grant Flow https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/v2-oauth2-client-creds-grant-flow.md
This type of authorization is common for daemons and service accounts that need
In order to enable this ACL-based authorization pattern, Azure AD doesn't require that applications be authorized to get tokens for another application. Thus, app-only tokens can be issued without a `roles` claim. Applications that expose APIs must implement permission checks in order to accept tokens.
-If you'd like to prevent applications from getting role-less app-only access tokens for your application, [ensure that user assignment requirements are enabled for your app](../manage-apps/what-is-access-management.md#requiring-user-assignment-for-an-app). This will block users and applications without assigned roles from being able to get a token for this application.
+If you'd like to prevent applications from getting role-less app-only access tokens for your application, [ensure that assignment requirements are enabled for your app](../manage-apps/what-is-access-management.md#requiring-user-assignment-for-an-app). This will block users and applications without assigned roles from being able to get a token for this application.
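Because role-less app-only tokens can be issued under the ACL pattern, an API that wants to require roles must check the `roles` claim itself. A minimal sketch of such a check (hypothetical helper names; it only decodes the payload and assumes signature, issuer, and audience validation happen elsewhere):

```python
import base64
import json

def decode_claims(jwt_token: str) -> dict:
    """Decode the payload of a JWT without verifying the signature.

    A real API must verify the signature, issuer, and audience first;
    this sketch only illustrates reading the claims.
    """
    payload = jwt_token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))

def has_required_role(claims: dict, required_role: str) -> bool:
    """Reject app-only tokens with no 'roles' claim; accept only tokens
    that carry the app role this API requires."""
    return required_role in claims.get("roles", [])
```

An app-only token issued without any `roles` claim fails `has_required_role` for every role, so it is rejected by default.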
### Application permissions
Instead of using ACLs, you can use APIs to expose a set of **application permiss
* Send mail as any user
* Read directory data
-To use application permissions with your own API (as opposed to Microsoft Graph), you must first [expose the API](howto-add-app-roles-in-azure-ad-apps.md) by defining scopes in the API's app registration in the Azure portal. Then, [configure access to the API](howto-add-app-roles-in-azure-ad-apps.md#assign-app-roles-to-applications) by selecting those permissions in your client application's app registration. If you haven't exposed any scopes in your API's app registration, you won't be able to specify application permissions to that API in your client application's app registration in the Azure portal.
+To use app roles (application permissions) with your own API (as opposed to Microsoft Graph), you must first [expose the app roles](howto-add-app-roles-in-azure-ad-apps.md) in the API's app registration in the Azure portal. Then, [configure the required app roles](howto-add-app-roles-in-azure-ad-apps.md#assign-app-roles-to-applications) by selecting those permissions in your client application's app registration. If you haven't exposed any app roles in your API's app registration, you won't be able to specify application permissions to that API in your client application's app registration in the Azure portal.
-When authenticating as an application (as opposed to with a user), you can't use *delegated permissions* - scopes that are granted by a user - because there is no user for your app to act on behalf of. You must use application permissions, also known as roles, that are granted by an admin for the application or via pre-authorization by the web API.
+When authenticating as an application (as opposed to with a user), you can't use *delegated permissions* because there is no user for your app to act on behalf of. You must use application permissions, also known as app roles, that are granted by an admin or by the API's owner.
For more information about application permissions, see [Permissions and consent](v2-permissions-and-consent.md#permission-types).
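For reference, the token request in this flow is a POST of form-encoded parameters to the v2.0 token endpoint. A minimal sketch that only builds the request (placeholder tenant, client ID, and secret; sending the request and handling the response are omitted):

```python
from urllib.parse import urlencode

def build_client_credentials_request(tenant_id: str, client_id: str,
                                     client_secret: str, scope: str):
    """Build the URL and form body for the OAuth 2.0 client credentials grant.

    With app-only access the scope is the resource's /.default scope; any
    granted application permissions come back in the token's 'roles' claim
    rather than being requested individually.
    """
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": scope,
    })
    return url, body
```

A production app would send this with its HTTP client of choice and, preferably, authenticate with a certificate or federated credential instead of a shared secret.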
active-directory Whats New Docs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/whats-new-docs.md
Previously updated : 01/03/2022 Last updated : 02/01/2022
Welcome to what's new in the Microsoft identity platform documentation. This article lists new docs that have been added and those that have had significant updates in the last three months.
+## January 2022
+
+### New articles
+
+- [Access Azure AD protected resources from an app in Google Cloud (preview)](workload-identity-federation-create-trust-gcp.md)
+- [Quickstart: Acquire a token and call the Microsoft Graph API by using a console app's identity](console-app-quickstart.md)
+- [Quickstart: Acquire a token and call Microsoft Graph API from a desktop application](desktop-app-quickstart.md)
+- [Quickstart: Add sign-in with Microsoft to a web app](web-app-quickstart.md)
+- [Quickstart: Protect a web API with the Microsoft identity platform](web-api-quickstart.md)
+- [Quickstart: Sign in users and call the Microsoft Graph API from a mobile application](mobile-app-quickstart.md)
+
+### Updated articles
+
+- [Confidential client assertions](msal-net-client-assertions.md)
+- [Claims mapping policy type](reference-claims-mapping-policy-type.md)
+- [Configure an app to trust a GitHub repo (preview)](workload-identity-federation-create-trust-github.md)
+- [Configure an app to trust an external identity provider (preview)](workload-identity-federation-create-trust.md)
+- [Exchange a SAML token issued by AD FS for a Microsoft Graph access token](v2-saml-bearer-assertion.md)
+- [Logging in MSAL.js](msal-logging-js.md)
+- [Permissions and consent in the Microsoft identity platform](v2-permissions-and-consent.md)
+- [Quickstart: Acquire a token and call Microsoft Graph API from a Java console app using app's identity](quickstart-v2-java-daemon.md)
+- [Quickstart: Acquire a token and call Microsoft Graph API from a Python console app using app's identity](quickstart-v2-python-daemon.md)
+- [Quickstart: Add sign-in with Microsoft to a Java web app](quickstart-v2-java-webapp.md)
+- [Quickstart: Add sign-in with Microsoft to a Python web app](quickstart-v2-python-webapp.md)
+- [Quickstart: Add sign-in with Microsoft to an ASP.NET Core web app](quickstart-v2-aspnet-core-webapp.md)
+- [Quickstart: ASP.NET web app that signs in Azure AD users](quickstart-v2-aspnet-webapp.md)
+- [Quickstart: Get a token and call the Microsoft Graph API by using a console app's identity](quickstart-v2-netcore-daemon.md)
+- [Quickstart: Protect an ASP.NET Core web API with the Microsoft identity platform](quickstart-v2-aspnet-core-web-api.md)
+- [Quickstart: Sign in users and call the Microsoft Graph API from an Android app](quickstart-v2-android.md)
+- [Quickstart: Sign in users and call the Microsoft Graph API from an iOS or macOS app](quickstart-v2-ios.md)
+
## December 2021

### New articles
Welcome to what's new in the Microsoft identity platform documentation. This art
- [Token cache serialization in MSAL.NET](msal-net-token-cache-serialization.md)
- [What's new for authentication?](reference-breaking-changes.md)
-## October 2021
-
-### New articles
-
-- [Configure an app to trust a GitHub repo (preview)](workload-identity-federation-create-trust-github.md)
-- [Configure an app to trust an external identity provider (preview)](workload-identity-federation-create-trust.md)
-- [Set up your application's Azure AD test environment](test-setup-environment.md)
-- [Throttling and service limits to consider for testing](test-throttle-service-limits.md)
-- [Workload identity federation (preview)](workload-identity-federation.md)
-
-### Updated articles
-
-- [Considerations for using Xamarin iOS with MSAL.NET](msal-net-xamarin-ios-considerations.md)
-- [Handle ITP in Safari and other browsers where third-party cookies are blocked](reference-third-party-cookies-spas.md)
-- [Initialize client applications using MSAL.js](msal-js-initializing-client-applications.md)
-- [Microsoft Graph API](microsoft-graph-intro.md)
-- [Microsoft identity platform and the OAuth 2.0 client credentials flow](v2-oauth2-client-creds-grant-flow.md)
-- [What's new for authentication?](reference-breaking-changes.md)
active-directory Groups Lifecycle https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/enterprise-users/groups-lifecycle.md
For information on how to download and install the Azure AD PowerShell cmdlets,
## Activity-based automatic renewal
-With Azure AD intelligence, groups are now automatically renewed based on whether they have been recently used. This feature eliminates the need for manual action by group owners, because it's based on user activity in groups across Microsoft 365 services like Outlook, SharePoint, or Teams. For example, if an owner or a group member does something like upload a document to SharePoint, visit a Teams channel, or send an email to the group in Outlook, the group is automatically renewed around 35 days before the group expires and the owner does not get any renewal notifications. The "All Company" group converted in Yammer Native Mode to a Microsoft 365 Group doesn't currently support this type of automatic renewal, and Yammer activities for that group aren't counted as activities.
+With Azure AD intelligence, groups are now automatically renewed based on whether they have been recently used. This feature eliminates the need for manual action by group owners, because it's based on user activity in groups across Microsoft 365 services like Outlook, SharePoint, Teams, or Yammer. For example, if an owner or a group member does something like upload a document to SharePoint, visit a Teams channel, send an email to the group in Outlook, or view a post in Yammer, the group is automatically renewed around 35 days before the group expires and the owner does not get any renewal notifications.
For example, consider an expiration policy set so that a group expires after 30 days of inactivity. To avoid sending an expiration email on the day that group expiration is enabled (because there's no recorded activity yet), Azure AD first waits five days. If there is activity in those five days, the expiration policy works as expected. If there is no activity within five days, we send an expiration/renewal email. If the group was inactive for five days, an email was sent, and the group then becomes active again, we automatically renew it and restart the expiration period.
The following user actions cause automatic group renewal:
- SharePoint: View, edit, download, move, share, or upload files
- Outlook: Join group, read/write group message from group space, Like a message (in Outlook Web Access)
- Teams: Visit a Teams channel
+- Yammer: View a post within a Yammer community or an interactive email in Outlook
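The renewal timing described above can be sketched as follows (an illustrative approximation of the documented behavior, not the service's actual implementation):

```python
from datetime import date, timedelta

RENEWAL_WINDOW_DAYS = 35  # per the docs: renewal happens ~35 days before expiry

def auto_renews(expiration: date, last_activity: date) -> bool:
    """Return True if qualifying activity falls inside the renewal window
    (roughly the last 35 days before the group expires). In that case the
    group is renewed silently and the owner gets no renewal notification."""
    window_start = expiration - timedelta(days=RENEWAL_WINDOW_DAYS)
    return window_start <= last_activity <= expiration
```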
### Auditing and reporting
active-directory Licensing Service Plan Reference https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/enterprise-users/licensing-service-plan-reference.md
Previously updated : 01/28/2022 Last updated : 01/31/2022
When managing licenses in [the Azure portal](https://portal.azure.com/#blade/Mic
- **Service plans included (friendly names)**: A list of service plans (friendly names) in the product that correspond to the string ID and GUID

>[!NOTE]
->This information last updated on January 28th, 2022.<br/>You can also download a CSV version of this table [here](https://download.microsoft.com/download/e/3/e/e3e9faf2-f28b-490a-9ada-c6089a1fc5b0/Product%20names%20and%20service%20plan%20identifiers%20for%20licensing.csv).
+>This information last updated on January 31st, 2022.<br/>You can also download a CSV version of this table [here](https://download.microsoft.com/download/e/3/e/e3e9faf2-f28b-490a-9ada-c6089a1fc5b0/Product%20names%20and%20service%20plan%20identifiers%20for%20licensing.csv).
><br/>

| Product name | String ID | GUID | Service plans included | Service plans included (friendly names) |
When managing licenses in [the Azure portal](https://portal.azure.com/#blade/Mic
| Dynamics 365 Enterprise Edition - Additional Portal (Qualified Offer) | CRM_ONLINE_PORTAL | a4bfb28e-becc-41b0-a454-ac680dc258d3 | EXCHANGE_S_FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>CRM_ONLINE_PORTAL (1d4e9cb1-708d-449c-9f71-943aa8ed1d6a) | Exchange Foundation (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>Microsoft Dynamics CRM Online - Portal Add-On (1d4e9cb1-708d-449c-9f71-943aa8ed1d6a) |
| Dynamics 365 Field Service Viral Trial | Dynamics_365_Field_Service_Enterprise_viral_trial | 29fcd665-d8d1-4f34-8eed-3811e3fca7b3 | CUSTOMER_VOICE_DYN365_VIRAL_TRIAL (dbe07046-af68-4861-a20d-1c8cbda9194f)<br/>DYN365_FS_ENTERPRISE_VIRAL_TRIAL (20d1455b-72b2-4725-8354-a177845ab77d)<br/>EXCHANGE_S_FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>POWER_APPS_DYN365_VIRAL_TRIAL (54b37829-818e-4e3c-a08a-3ea66ab9b45d)<br/>POWER_AUTOMATE_DYN365_VIRAL_TRIAL (81d4ecb8-0481-42fb-8868-51536c5aceeb) | Customer Voice for Dynamics 365 vTrial (dbe07046-af68-4861-a20d-1c8cbda9194f)<br/>Dynamics 365 Field Service Enterprise vTrial (20d1455b-72b2-4725-8354-a177845ab77d)<br/>Exchange Foundation (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>Power Apps for Dynamics 365 vTrial (54b37829-818e-4e3c-a08a-3ea66ab9b45d)<br/>Power Automate for Dynamics 365 vTrial (81d4ecb8-0481-42fb-8868-51536c5aceeb) |
| Dynamics 365 Finance | DYN365_FINANCE | 55c9eb4e-c746-45b4-b255-9ab6b19d5c62 | DYN365_CDS_FINANCE (e95d7060-d4d9-400a-a2bd-a244bf0b609e)<br/>DYN365_REGULATORY_SERVICE (c7657ae3-c0b0-4eed-8c1d-6a7967bd9c65)<br/>EXCHANGE_S_FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>D365_Finance (9f0e1b4e-9b33-4300-b451-b2c662cd4ff7)<br/>POWERAPPS_DYN_APPS (874fc546-6efe-4d22-90b8-5c4e7aa59f4b)<br/>FLOW_DYN_APPS (7e6d7d78-73de-46ba-83b1-6d25117334ba) | Common Data Service for Dynamics 365 Finance (e95d7060-d4d9-400a-a2bd-a244bf0b609e)<br/>Dynamics 365 for Finance and Operations, Enterprise edition - Regulatory Service (c7657ae3-c0b0-4eed-8c1d-6a7967bd9c65)<br/>Exchange Foundation (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>Microsoft Dynamics 365 for Finance (9f0e1b4e-9b33-4300-b451-b2c662cd4ff7)<br/>Power Apps for Dynamics 365 (874fc546-6efe-4d22-90b8-5c4e7aa59f4b)<br/>Power Automate for Dynamics 365 (7e6d7d78-73de-46ba-83b1-6d25117334ba) |
-| DYNAMICS 365 FOR CUSTOMER SERVICE ENTERPRISE EDITION | DYN365_ENTERPRISE_CUSTOMER_SERVICE | 749742bf-0d37-4158-a120-33567104deeb | DYN365_ENTERPRISE_CUSTOMER_SERVICE (99340b49-fb81-4b1e-976b-8f2ae8e9394f)<br/>FLOW_DYN_APPS (7e6d7d78-73de-46ba-83b1-6d25117334ba)<br/>NBENTERPRISE (03acaee3-9492-4f40-aed4-bcb6b32981b6)<br/>POWERAPPS_DYN_APPS (874fc546-6efe-4d22-90b8-5c4e7aa59f4b)<br/>PROJECT_ESSENTIALS (1259157c-8581-4875-bca7-2ffb18c51bda)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014) |MICROSOFT SOCIAL ENGAGEMENT - SERVICE DISCONTINUATION (03acaee3-9492-4f40-aed4-bcb6b32981b6)<br/>PROJECT ONLINE ESSENTIALS (1259157c-8581-4875-bca7-2ffb18c51bda)<br/>SHAREPOINT ONLINE (PLAN 2) (5dbe027f-2339-4123-9542-606e4d348a72)<br/>FLOW FOR DYNAMICS 365 (7e6d7d78-73de-46ba-83b1-6d25117334ba)<br/>POWERAPPS FOR DYNAMICS 365 (874fc546-6efe-4d22-90b8-5c4e7aa59f4b)<br/>DYNAMICS 365 FOR CUSTOMER SERVICE (99340b49-fb81-4b1e-976b-8f2ae8e9394f)<br/>OFFICE ONLINE (e95bec33-7c88-4a70-8e19-b10bd9d0c014) |
+| Dynamics 365 for Customer Service Enterprise Edition | DYN365_ENTERPRISE_CUSTOMER_SERVICE | 749742bf-0d37-4158-a120-33567104deeb | D365_CSI_EMBED_CSEnterprise (5b1e5982-0e88-47bb-a95e-ae6085eda612)<br/>DYN365_ENTERPRISE_CUSTOMER_SERVICE (99340b49-fb81-4b1e-976b-8f2ae8e9394f)<br/>EXCHANGE_S_FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>Forms_Pro_Service (67bf4812-f90b-4db9-97e7-c0bbbf7b2d09)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>POWERAPPS_DYN_APPS (874fc546-6efe-4d22-90b8-5c4e7aa59f4b)<br/>FLOW_DYN_APPS (7e6d7d78-73de-46ba-83b1-6d25117334ba)<br/>PROJECT_ESSENTIALS (1259157c-8581-4875-bca7-2ffb18c51bda)<br/>NBENTERPRISE (03acaee3-9492-4f40-aed4-bcb6b32981b6)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72) | Dynamics 365 Customer Service Insights for CS Enterprise (5b1e5982-0e88-47bb-a95e-ae6085eda612)<br/>Dynamics 365 for Customer Service (99340b49-fb81-4b1e-976b-8f2ae8e9394f)<br/>Exchange Foundation (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>Microsoft Dynamics 365 Customer Voice for Customer Service Enterprise (67bf4812-f90b-4db9-97e7-c0bbbf7b2d09)<br/>Office for the Web (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>Power Apps for Dynamics 365 (874fc546-6efe-4d22-90b8-5c4e7aa59f4b)<br/>Power Automate for Dynamics 365 (7e6d7d78-73de-46ba-83b1-6d25117334ba)<br/>Project Online Essentials (1259157c-8581-4875-bca7-2ffb18c51bda)<br/>Retired - Microsoft Social Engagement (03acaee3-9492-4f40-aed4-bcb6b32981b6)<br/>SharePoint (Plan 2) (5dbe027f-2339-4123-9542-606e4d348a72) |
| DYNAMICS 365 FOR FINANCIALS BUSINESS EDITION | DYN365_FINANCIALS_BUSINESS_SKU | cc13a803-544e-4464-b4e4-6d6169a138fa | DYN365_FINANCIALS_BUSINESS (920656a2-7dd8-4c83-97b6-a356414dbd36)<br/>FLOW_DYN_APPS (7e6d7d78-73de-46ba-83b1-6d25117334ba)<br/>POWERAPPS_DYN_APPS (874fc546-6efe-4d22-90b8-5c4e7aa59f4b) |FLOW FOR DYNAMICS 365 (7e6d7d78-73de-46ba-83b1-6d25117334ba)<br/>POWERAPPS FOR DYNAMICS 365 (874fc546-6efe-4d22-90b8-5c4e7aa59f4b)<br/>DYNAMICS 365 FOR FINANCIALS (920656a2-7dd8-4c83-97b6-a356414dbd36) |
| DYNAMICS 365 FOR SALES AND CUSTOMER SERVICE ENTERPRISE EDITION | DYN365_ENTERPRISE_SALES_CUSTOMERSERVICE | 8edc2cf8-6438-4fa9-b6e3-aa1660c640cc | DYN365_ENTERPRISE_P1 (d56f3deb-50d8-465a-bedb-f079817ccac1)<br/>FLOW_DYN_APPS (7e6d7d78-73de-46ba-83b1-6d25117334ba)<br/>NBENTERPRISE (03acaee3-9492-4f40-aed4-bcb6b32981b6)<br/>POWERAPPS_DYN_APPS (874fc546-6efe-4d22-90b8-5c4e7aa59f4b)<br/>PROJECT_ESSENTIALS (1259157c-8581-4875-bca7-2ffb18c51bda)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014) |DYNAMICS 365 CUSTOMER ENGAGEMENT PLAN (d56f3deb-50d8-465a-bedb-f079817ccac1)<br/>FLOW FOR DYNAMICS 365 (7e6d7d78-73de-46ba-83b1-6d25117334ba)<br/>MICROSOFT SOCIAL ENGAGEMENT - SERVICE DISCONTINUATION (03acaee3-9492-4f40-aed4-bcb6b32981b6)<br/>POWERAPPS FOR DYNAMICS 365 (874fc546-6efe-4d22-90b8-5c4e7aa59f4b)<br/>PROJECT ONLINE ESSENTIALS (1259157c-8581-4875-bca7-2ffb18c51bda)<br/>SHAREPOINT ONLINE (PLAN 2) (5dbe027f-2339-4123-9542-606e4d348a72)<br/>OFFICE ONLINE (e95bec33-7c88-4a70-8e19-b10bd9d0c014) |
| DYNAMICS 365 FOR SALES ENTERPRISE EDITION | DYN365_ENTERPRISE_SALES | 1e1a282c-9c54-43a2-9310-98ef728faace | DYN365_ENTERPRISE_SALES (2da8e897-7791-486b-b08f-cc63c8129df7)<br/>FLOW_DYN_APPS (7e6d7d78-73de-46ba-83b1-6d25117334ba)<br/>NBENTERPRISE (03acaee3-9492-4f40-aed4-bcb6b32981b6)<br/>POWERAPPS_DYN_APPS (874fc546-6efe-4d22-90b8-5c4e7aa59f4b)<br/>PROJECT_ESSENTIALS (1259157c-8581-4875-bca7-2ffb18c51bda)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014) | DYNAMICS 365 FOR SALES (2da8e897-7791-486b-b08f-cc63c8129df7)<br/>FLOW FOR DYNAMICS 365 (7e6d7d78-73de-46ba-83b1-6d25117334ba)<br/>MICROSOFT SOCIAL ENGAGEMENT - SERVICE DISCONTINUATION (03acaee3-9492-4f40-aed4-bcb6b32981b6)<br/>POWERAPPS FOR DYNAMICS 365 (874fc546-6efe-4d22-90b8-5c4e7aa59f4b)<br/>PROJECT ONLINE ESSENTIALS (1259157c-8581-4875-bca7-2ffb18c51bda)<br/>SHAREPOINT ONLINE (PLAN 2) (5dbe027f-2339-4123-9542-606e4d348a72)<br/>OFFICE ONLINE (e95bec33-7c88-4a70-8e19-b10bd9d0c014) |
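When working with the downloadable CSV programmatically, the "Service plans included" cells follow a `NAME (GUID)<br/>NAME (GUID)` pattern. A small sketch for splitting such a cell into pairs (assumes the cell format shown in the table above):

```python
import re

def parse_service_plans(cell: str):
    """Split a 'Service plans included' cell of the form
    'NAME (guid)<br/>NAME (guid)...' into (name, guid) pairs."""
    pairs = []
    for part in cell.split("<br/>"):
        # a GUID is 36 characters of hex digits and hyphens
        match = re.match(r"\s*(.+?)\s*\(([0-9a-f-]{36})\)\s*$", part)
        if match:
            pairs.append((match.group(1), match.group(2)))
    return pairs
```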
active-directory Service Accounts Governing Azure https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/fundamentals/service-accounts-governing-azure.md
There are three types of service accounts in Azure Active Directory (Azure AD): [managed identities](service-accounts-managed-identities.md), [service principals](service-accounts-principal.md), and user accounts employed as service accounts. As you create these service accounts for automated use, they're granted permissions to access resources in Azure and Azure AD. Resources can include Microsoft 365 services, software as a service (SaaS) applications, custom applications, databases, HR systems, and so on. Governing Azure AD service accounts means that you manage their creation, permissions, and lifecycle to ensure security and continuity.

> [!IMPORTANT]
-> We do not recommend using user accounts as service accounts as they are inherently less secure. This includes on-premises service accounts that are synced to Azure AD, as they are not converted to service principals. Instead, we recommend the use of managed identities or service principals. Note that at this time the use of conditional access policies is not possible with service principals, but the functionality is coming.
+> We do not recommend using user accounts as service accounts as they are inherently less secure. This includes on-premises service accounts that are synced to Azure AD, as they are not converted to service principals. Instead, we recommend the use of managed identities or service principals. Note that at this time the use of conditional access policies with service principals is called Conditional Access for workload identities and it's in public preview.
## Plan your service account
active-directory Reference Connect Version History https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/hybrid/reference-connect-version-history.md
ms.assetid: ef2797d7-d440-4a9a-a648-db32ad137494
Previously updated : 10/21/2021 Last updated : 1/31/2022
Topic | Details
Steps to upgrade from Azure AD Connect | Different methods to [upgrade from a previous version to the latest](how-to-upgrade-previous-version.md) Azure AD Connect release.
Required permissions | For permissions required to apply an update, see [Azure AD Connect: Accounts and permissions](reference-connect-accounts-permissions.md#upgrade).
+## Retiring Azure AD Connect 1.x versions
> [!IMPORTANT]
> *On August 31, 2022, all 1.x versions of Azure AD Connect will be retired because they include SQL Server 2012 components that will no longer be supported.* Upgrade to the most recent version of Azure AD Connect (2.x version) by that date or [evaluate and switch to Azure AD cloud sync](../cloud-sync/what-is-cloud-sync.md).
-Make sure you're running a recent version of Azure AD Connect to receive an optimal support experience.
+## Retiring Azure AD Connect 2.x versions
+> [!IMPORTANT]
+> We will begin retiring past versions of Azure AD Connect Sync 2.x 12 months from the date they are superseded by a newer version.
+> This policy will go into effect on 15 March 2023, when we will retire all versions that are superseded by a newer version on 15 March 2022.
+>
+> The following versions will retire on 15 March 2023:
+>
+> - 2.0.89.0
+> - 2.0.88.0
+> - 2.0.28.0
+> - 2.0.25.1
+> - 2.0.10.0
+> - 2.0.9.0
+> - 2.0.8.0
+> - 2.0.3.0
+>
+> If you are not already using the latest release version of Azure AD Connect Sync, you should upgrade your Azure AD Connect Sync software before that date.
+>
+> This policy does not change the retirement of all 1.x versions of Azure AD Connect Sync on 31 August 2022, which is due to the retirement of the SQL Server 2012 and Azure AD Authentication Library (ADAL) components.
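The stated 2.x policy works out to "retirement date = superseded date + 12 months, but never before 15 March 2023". A rough sketch of that rule (naive month arithmetic; edge dates such as 29 February are not handled):

```python
from datetime import date

POLICY_START = date(2023, 3, 15)  # first enforcement date of the 2.x policy

def retirement_date(superseded_on: date) -> date:
    """Sketch of the stated policy: a 2.x version retires 12 months after
    it is superseded by a newer version, but no earlier than 15 March 2023."""
    twelve_months_later = date(superseded_on.year + 1,
                               superseded_on.month,
                               superseded_on.day)
    return max(twelve_months_later, POLICY_START)
```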
If you run a retired version of Azure AD Connect, it might unexpectedly stop working. You also might not have the latest security fixes, performance improvements, troubleshooting and diagnostic tools, and service enhancements. If you require support, we might not be able to provide you with the level of service your organization needs.
active-directory Application Sign In Other Problem Access Panel https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/application-sign-in-other-problem-access-panel.md
Previously updated : 07/11/2017 Last updated : 02/01/2022
To check if you have the correct deep link, follow these steps:
4. Select **Enterprise Applications** from the Azure Active Directory left-hand navigation menu.
5. Select **All Applications** to view a list of all your applications.
   - If you do not see the application you want in this list, use the **Filter** control at the top of the **All Applications List** and set the **Show** option to **All Applications**.
-6. Open the [**Azure portal**](https://portal.azure.com/) and sign in as a **Global Administrator** or **Co-admin.**
-7. Open the **Azure Active Directory Extension** by selecting **All services** at the top of the main left-hand navigation menu.
-8. Type in **"Azure Active Directory"** in the filter search box and select the **Azure Active Directory** item.
-9. Select **Enterprise Applications** from the Azure Active Directory left-hand navigation menu.
-10. Select **All Applications** to view a list of all your applications.
- - If you do not see the application you want show up here, use the **Filter** control at the top of the **All Applications List** and set the **Show** option to **All Applications.**
-11. Select the application you want the check the deep link for.
-12. Find the label **User Access URL**. Your deep link should match this URL.
+6. Select the application you want to check the deep link for.
+7. Find the label **User Access URL**. Your deep link should match this URL.
## Contact support
Open a support ticket with the following information if available:
## Next steps

-- [Quickstart Series on Application Management](view-applications-portal.md)
+- [Quickstart Series on Application Management](view-applications-portal.md)
active-directory Services Azure Active Directory Support https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/managed-identities-azure-resources/services-azure-active-directory-support.md
description: List of services that support Azure AD authentication
Previously updated : 01/10/2022 Last updated : 02/01/2022
The following services support Azure AD authentication. New services are added t
| Azure Databricks | [Authenticate using Azure Active Directory tokens](/azure/databricks/dev-tools/api/latest/aad/) |
| Azure Data Explorer | [How-To Authenticate with Azure Active Directory for Azure Data Explorer Access](/azure/data-explorer/kusto/management/access-control/how-to-authenticate-with-aad) |
| Azure Data Lake Storage Gen1 | [Authentication with Azure Data Lake Storage Gen1 using Azure Active Directory](../../data-lake-store/data-lakes-store-authentication-using-azure-active-directory.md) |
+| Azure Database for PostgreSQL | [Use Azure Active Directory for authentication with PostgreSQL](../../postgresql/howto-configure-sign-in-aad-authentication.md) |
| Azure Digital Twins | [Set up an Azure Digital Twins instance and authentication (portal)](../../digital-twins/how-to-set-up-instance-portal.md#set-up-user-access-permissions) |
| Azure Event Hubs | [Authenticate an application with Azure Active Directory to access Event Hubs resources](../../event-hubs/authenticate-application.md) |
| Azure IoT Hub | [Control access to IoT Hub](../../iot-hub/iot-hub-devguide-security.md) |
aks Concepts Scale https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/aks/concepts-scale.md
To get started with manually scaling pods and nodes see [Scale applications in A
## Horizontal pod autoscaler
-Kubernetes uses the horizontal pod autoscaler (HPA) to monitor the resource demand and automatically scale the number of replicas. By default, the horizontal pod autoscaler checks the Metrics API every 30 seconds for any required changes in replica count. When changes are required, the number of replicas is increased or decreased accordingly. Horizontal pod autoscaler works with AKS clusters that have deployed the Metrics Server for Kubernetes 1.8+.
+Kubernetes uses the horizontal pod autoscaler (HPA) to monitor the resource demand and automatically scale the number of replicas. By default, the horizontal pod autoscaler checks the Metrics API every 60 seconds for any required changes in replica count. When changes are required, the number of replicas is increased or decreased accordingly. Horizontal pod autoscaler works with AKS clusters that have deployed the Metrics Server for Kubernetes 1.8+.
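When the HPA evaluates a metric, the upstream Kubernetes algorithm computes the desired replica count as `ceil(currentReplicas * currentMetricValue / targetMetricValue)`. A minimal sketch of that rule:

```python
import math

def desired_replicas(current_replicas: int, current_metric: float,
                     target_metric: float) -> int:
    """The HPA's core scaling rule as documented by Kubernetes:
    desiredReplicas = ceil(currentReplicas * currentMetric / targetMetric)."""
    return math.ceil(current_replicas * current_metric / target_metric)
```

For example, three replicas averaging 200m CPU against a 100m target scale out to six; the real controller additionally applies tolerances, stabilization windows, and min/max replica bounds.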
![Kubernetes horizontal pod autoscaling](media/concepts-scale/horizontal-pod-autoscaling.png)
aks Open Service Mesh About https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/aks/open-service-mesh-about.md
OSM can be used to help your AKS deployments in many different ways. For example
- Configure weighted traffic controls between two or more services for A/B testing or canary deployments.
- Collect and view KPIs from application traffic.
+## Add-on limitations
+
+The OSM AKS add-on has the following limitations:
+
+* [Iptables redirection][ip-tables-redirection] for IP address and port range exclusion must be enabled using `kubectl patch` after installation. For more details, see [iptables redirection][ip-tables-redirection].
+* Pods that are onboarded to the mesh that need access to IMDS, Azure DNS, or the Kubernetes API server must have their IP addresses added to the global list of excluded outbound IP ranges using [Global outbound IP range exclusions][global-exclusion].
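Upstream OSM exposes these exclusions as fields on the `osm-mesh-config` MeshConfig resource. A hedged example of such a patch follows; the field name, namespace, and example CIDR are taken from the upstream OSM iptables redirection guide and should be confirmed against your add-on version:

```bash
# Exclude an outbound IP range (here the IMDS address, as an example) from
# interception by the Envoy sidecar. Field names per upstream OSM docs.
kubectl patch meshconfig osm-mesh-config -n kube-system --type merge \
  -p '{"spec":{"traffic":{"outboundIPRangeExclusionList":["169.254.169.254/32"]}}}'
```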
[osm-azure-cli]: open-service-mesh-deploy-addon-az-cli.md
-[osm-bicep]: open-service-mesh-deploy-addon-bicep.md
+[osm-bicep]: open-service-mesh-deploy-addon-bicep.md
+[ip-tables-redirection]: https://docs.openservicemesh.io/docs/guides/traffic_management/iptables_redirection/
+[global-exclusion]: https://docs.openservicemesh.io/docs/guides/traffic_management/iptables_redirection/#global-outbound-ip-range-exclusions
aks Open Service Mesh Ip Port Exclusion https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/aks/open-service-mesh-ip-port-exclusion.md
- Title: IP and port range exclusion
-description: Implement IP and port range exclusion
-- Previously updated : 10/8/2021---
-# Implement IP and port range exclusion
-
-Outbound TCP based traffic from applications is by default intercepted using the `iptables` rules programmed by OSM, and redirected to the Envoy proxy sidecar. OSM provides a means to specify a list of IP ranges and ports to exclude from traffic interception if necessary. For guidance on how to exclude IP and port ranges, refer to [this documentation](https://docs.openservicemesh.io/docs/guides/traffic_management/iptables_redirection/).
-
-> [!NOTE]
->
-> - For the Open Service Mesh AKS add-on, **port exclusion can only be implemented after installation using `kubectl patch` and not during installation using the OSM CLI `--set` flag.**
-> - If the application pods that are a part of the mesh need access to IMDS, Azure DNS or the Kubernetes API server, the user needs to explicitly add these IP addresses to the list of Global outbound IP ranges using the above command. See an example of Kubernetes API Server port exclusion [here](https://docs.openservicemesh.io/docs/guides/app_onboarding/#onboard-services).
aks Spot Node Pool https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/aks/spot-node-pool.md
description: Learn how to add a spot node pool to an Azure Kubernetes Service (A
Previously updated : 10/19/2020 Last updated : 01/21/2022 #Customer intent: As a cluster operator or developer, I want to learn how to add a spot node pool to an AKS Cluster.
The following limitations apply when you create and manage AKS clusters with a s
* You cannot change ScaleSetPriority or SpotMaxPrice after creation.
* When setting SpotMaxPrice, the value must be -1 or a positive value with up to five decimal places.
* A spot node pool will have the label *kubernetes.azure.com/scalesetpriority:spot*, the taint *kubernetes.azure.com/scalesetpriority=spot:NoSchedule*, and system pods will have anti-affinity.
-* You must add a [corresponding toleration][spot-toleration] to schedule workloads on a spot node pool.
+* You must add a [corresponding toleration][spot-toleration] and affinity to schedule workloads on a spot node pool.
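The SpotMaxPrice constraint above (-1, or a positive value with at most five decimal places) can be sanity-checked before calling the CLI. This helper is illustrative only and not part of any Azure SDK:

```python
from decimal import Decimal, InvalidOperation

def is_valid_spot_max_price(price: str) -> bool:
    """Validate a SpotMaxPrice value: -1 (pay up to the on-demand price),
    or a positive price with no more than five decimal places."""
    try:
        value = Decimal(price)
    except InvalidOperation:
        return False
    if value == Decimal("-1"):
        return True
    # quantize() rounds to five decimal places; a valid price is unchanged by it.
    return value > 0 and value == value.quantize(Decimal("0.00001"))
```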
## Add a spot node pool to an AKS cluster
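As a hedged sketch, adding a spot pool with the Azure CLI uses `az aks nodepool add` with `--priority Spot`; the resource names and counts below are placeholders, and the flags should be checked against the [az aks nodepool add][az-aks-nodepool-add] reference:

```azurecli-interactive
az aks nodepool add \
    --resource-group myResourceGroup \
    --cluster-name myAKSCluster \
    --name spotnodepool \
    --priority Spot \
    --eviction-policy Delete \
    --spot-max-price -1 \
    --enable-cluster-autoscaler \
    --min-count 1 \
    --max-count 3 \
    --no-wait
```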
az aks nodepool show --resource-group myResourceGroup --cluster-name myAKSCluste
Confirm *scaleSetPriority* is *Spot*.
-To schedule a pod to run on a spot node, add a toleration that corresponds to the taint applied to your spot node. The following example shows a portion of a yaml file that defines a toleration that corresponds to a *kubernetes.azure.com/scalesetpriority=spot:NoSchedule* taint used in the previous step.
+To schedule a pod to run on a spot node, add a toleration and node affinity that corresponds to the taint applied to your spot node. The following example shows a portion of a yaml file that defines a toleration that corresponds to the *kubernetes.azure.com/scalesetpriority=spot:NoSchedule* taint and a node affinity that corresponds to the *kubernetes.azure.com/scalesetpriority=spot* label used in the previous step.
```yaml spec:
spec:
operator: "Equal" value: "spot" effect: "NoSchedule"
+ affinity:
+ nodeAffinity:
+ requiredDuringSchedulingIgnoredDuringExecution:
+ nodeSelectorTerms:
+ - matchExpressions:
+ - key: "kubernetes.azure.com/scalesetpriority"
+ operator: In
+ values:
+ - "spot"
... ```
-When a pod with this toleration is deployed, Kubernetes can successfully schedule the pod on the nodes with the taint applied.
+When a pod with this toleration and node affinity is deployed, Kubernetes will successfully schedule the pod on the nodes with the taint and label applied.
## Max price for a spot pool [Pricing for spot instances is variable][pricing-spot], based on region and SKU. For more information, see pricing for [Linux][pricing-linux] and [Windows][pricing-windows].
In this article, you learned how to add a spot node pool to an AKS cluster. For
[aks-support-policies]: support-policies.md [aks-faq]: faq.md [azure-cli-install]: /cli/azure/install-azure-cli
-[az-aks-nodepool-add]: /cli/azure/aks/nodepool#az_aks_nodepool_add
+[az-aks-nodepool-add]: /cli/azure/aks/nodepool#az-aks-nodepool-add
[cluster-autoscaler]: cluster-autoscaler.md [eviction-policy]: ../virtual-machine-scale-sets/use-spot.md#eviction-policy [kubernetes-concepts]: concepts-clusters-workloads.md
api-management Api Management Get Started Publish Versions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/api-management/api-management-get-started-publish-versions.md Binary files differ
api-management Quickstart Arm Template https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/api-management/quickstart-arm-template.md
tags: azure-resource-manager -+ Last updated 10/09/2020
app-service Configure Ssl Certificate https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/configure-ssl-certificate.md
The free App Service managed certificate is a turn-key solution for securing you
The free certificate comes with the following limitations:
- Does not support wildcard certificates.
-- Does not support usage as a client certificate by certificate thumbprint (removal of certificate thumbprint is planned).
+- Does not support usage as a client certificate by using certificate thumbprint (removal of certificate thumbprint is planned).
- Does not support private DNS.
- Is not exportable.
- Is not supported on App Service Environment (ASE).
The free certificate comes with the following limitations:
# [Apex domain](#tab/apex)
- Must have an A record pointing to your web app's IP address.
+- Is not supported on apps that are not publicly accessible.
- Is not supported with root domains that are integrated with Traffic Manager.
-- All the above must be met for successful certificate issuances and renewals
+- All the above must be met for successful certificate issuances and renewals.
# [Subdomain](#tab/subdomain)
-- Must have CNAME mapped _directly_ to \<app-name\>.azurewebsites.net; using services that proxy the CNAME value will block certificate issuance and renewal
-- All the above must be met for successful certificate issuance and renewals
+- Must have CNAME mapped _directly_ to `<app-name>.azurewebsites.net`. Mapping to an intermediate CNAME value will block certificate issuance and renewal.
+- All the above must be met for successful certificate issuance and renewals.
--
Click **Rekey** to start the process. This process can take 1-10 minutes to comp
Rekeying your certificate rolls the certificate with a new certificate issued from the certificate authority.
-You may be required to [re-verify domain ownership](#verify-domain-ownership).
+You may be required to [reverify domain ownership](#verify-domain-ownership).
Once the rekey operation is complete, click **Sync**. The sync operation automatically updates the hostname bindings for the certificate in App Service without causing any downtime to your apps.
Once the rekey operation is complete, click **Sync**. The sync operation automat
Because an App Service Certificate is a [Key Vault secret](../key-vault/general/about-keys-secrets-certificates.md), you can export a PFX copy of it and use it for other Azure services or outside of Azure.
+> [!NOTE]
+> The exported certificate is an unmanaged artifact. For example, it isn't synced when the App Service Certificate is [renewed](#renew-an-app-service-certificate). You must export the renewed certificate and install it where you need it.
+ To export the App Service Certificate as a PFX file, run the following commands in the [Cloud Shell](https://shell.azure.com). You can also run it locally if you [installed Azure CLI](/cli/azure/install-azure-cli). Replace the placeholders with the names you used when you [created the App Service certificate](#start-certificate-order). ```azurecli-interactive
The downloaded *appservicecertificate.pfx* file is a raw PKCS12 file that contai
### Delete certificate
-Deletion of an App Service certificate is final and irreversible. Deletion of a App Service Certificate resource results in the certificate being revoked. Any binding in App Service with this certificate becomes invalid. To prevent accidental deletion, Azure puts a lock on the certificate. To delete an App Service certificate, you must first remove the delete lock on the certificate.
+Deletion of an App Service certificate is final and irreversible. Deletion of an App Service Certificate resource results in the certificate being revoked. Any binding in App Service with this certificate becomes invalid. To prevent accidental deletion, Azure puts a lock on the certificate. To delete an App Service certificate, you must first remove the delete lock on the certificate.
Select the certificate in the [App Service Certificates](https://portal.azure.com/#blade/HubsExtension/Resources/resourceType/Microsoft.CertificateRegistration%2FcertificateOrders) page, then select **Locks** in the left navigation.
app-service How To Migrate https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/environment/how-to-migrate.md
Title: How to migrate App Service Environment v2 to App Service Environment v3
description: Learn how to migrate your App Service Environment v2 to App Service Environment v3 Previously updated : 1/28/2022 Last updated : 2/01/2022 zone_pivot_groups: app-service-cli-portal
Ensure you understand how migrating to an App Service Environment v3 will affect
::: zone pivot="experience-azcli"
-When using the Azure CLI to carry out the migration, you should follow the below steps in order and as written since you'll be making Azure REST API calls. The recommended way for making these calls is by using the [Azure CLI](/cli/azure/). For information about other methods, see [Getting Started with Azure REST](/rest/api/azure/).
+The recommended experience for migration is using the [Azure portal](how-to-migrate.md?pivots=experience-azp). If you decide to use the Azure CLI to carry out the migration, you should follow the below steps in order and as written since you'll be making Azure REST API calls. The recommended way for making these API calls is by using the [Azure CLI](/cli/azure/). For information about other methods, see [Getting Started with Azure REST](/rest/api/azure/).
For this guide, [install the Azure CLI](/cli/azure/install-azure-cli) or use the [Azure Cloud Shell](https://shell.azure.com/).
app-service Overview Arc Integration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/overview-arc-integration.md
Title: 'App Service on Azure Arc' description: An introduction to App Service integration with Azure Arc for Azure operators. Previously updated : 12/03/2021 Last updated : 01/31/2022 # App Service, Functions, and Logic Apps on Azure Arc (Preview)
You can run App Service, Functions, and Logic Apps on an Azure Arc-enabled Kuber
> [!NOTE] > To learn how to set up your Kubernetes cluster for App Service, Functions, and Logic Apps, see [Create an App Service Kubernetes environment (Preview)](manage-create-arc-environment.md).
-In most cases, app developers need to know nothing more than how to deploy to the correct Azure region that represents the deployed Kubernetes environment. For operators who provide the environment and maintain the underlying Kubernetes infrastructure, you need to be aware of the following Azure resources:
+In most cases, app developers need to know nothing more than how to deploy to the correct Azure region that represents the deployed Kubernetes environment. For operators who provide the environment and maintain the underlying Kubernetes infrastructure, you must be aware of the following Azure resources:
- The connected cluster, which is an Azure projection of your Kubernetes infrastructure. For more information, see [What is Azure Arc-enabled Kubernetes?](../azure-arc/kubernetes/overview.md).
-- A cluster extension, which is a sub-resource of the connected cluster resource. The App Service extension [installs the required pods into your connected cluster](#pods-created-by-the-app-service-extension). For more information about cluster extensions, see [Cluster extensions on Azure Arc-enabled Kubernetes](../azure-arc/kubernetes/conceptual-extensions.md).
+- A cluster extension, which is a subresource of the connected cluster resource. The App Service extension [installs the required pods into your connected cluster](#pods-created-by-the-app-service-extension). For more information about cluster extensions, see [Cluster extensions on Azure Arc-enabled Kubernetes](../azure-arc/kubernetes/conceptual-extensions.md).
- A custom location, which bundles together a group of extensions and maps them to a namespace for created resources. For more information, see [Custom locations on top of Azure Arc-enabled Kubernetes](../azure-arc/kubernetes/conceptual-custom-locations.md).
-- An App Service Kubernetes environment, which enables configuration common across apps but not related to cluster operations. Conceptually, it's deployed into the custom location resource, and app developers create apps into this environment. This is described in greater detail in [App Service Kubernetes environment](#app-service-kubernetes-environment).
+- An App Service Kubernetes environment, which enables configuration common across apps but not related to cluster operations. Conceptually, it's deployed into the custom location resource, and app developers create apps into this environment. This resource is described in greater detail in [App Service Kubernetes environment](#app-service-kubernetes-environment).
## Public preview limitations
-The following public preview limitations apply to App Service Kubernetes environments. They will be updated as changes are made available.
+The following public preview limitations apply to App Service Kubernetes environments. This list of limitations is updated as changes and features are made available.
| Limitation | Details | |||
The following public preview limitations apply to App Service Kubernetes environ
| Feature: Key vault references | Not available (depends on managed identities) | | Feature: Pull images from ACR with managed identity | Not available (depends on managed identities) | | Feature: In-portal editing for Functions and Logic Apps | Not available |
+| Feature: Portal listing of Functions or keys | Not available if cluster is not publicly reachable |
| Feature: FTP publishing | Not available | | Logs | Log Analytics must be configured with cluster extension; not per-site | ## Pods created by the App Service extension
-When the App Service extension is installed on the Azure Arc-enabled Kubernetes cluster, you see several pods created in the release namespace that was specified. These pods enable your Kubernetes cluster to be an extension of the `Microsoft.Web` resource provider in Azure and support the management and operation of your apps. Optionally, you can choose to have the extension install [KEDA](https://keda.sh/) for event-driven scaling.
+When the App Service extension is installed on the Azure Arc-enabled Kubernetes cluster, several pods are created in the release namespace that was specified. These pods enable your Kubernetes cluster to be an extension of the `Microsoft.Web` resource provider in Azure and support the management and operation of your apps. Optionally, you can choose to have the extension install [KEDA](https://keda.sh/) for event-driven scaling.
<!-- You can only have one installation of KEDA on the cluster. If you have one already, you must disable this behavior during installation of the cluster extension `TODO`. --> The following table describes the role of each pod that is created by default:
The following table describes the role of each pod that is created by default:
## App Service Kubernetes environment
-The App Service Kubernetes environment resource is required before apps may be created. It enables configuration common to apps in the custom location, such as the default DNS suffix.
+The App Service Kubernetes environment resource is required before apps can be created. It enables configuration common to apps in the custom location, such as the default DNS suffix.
-Only one Kubernetes environment resource may be created in a custom location. In most cases, a developer who creates and deploys apps doesn't need to be directly aware of the resource. It can be directly inferred from the provided custom location ID. However, when defining Azure Resource Manager templates, any plan resource needs to reference the resource ID of the environment directly. The custom location values of the plan and the specified environment must match.
+Only one Kubernetes environment resource can be created in a custom location. In most cases, a developer who creates and deploys apps doesn't need to be directly aware of the resource. It can be directly inferred from the provided custom location ID. However, when defining Azure Resource Manager templates, any plan resource needs to reference the resource ID of the environment directly. The custom location values of the plan and the specified environment must match.
## FAQ for App Service, Functions, and Logic Apps on Azure Arc (Preview)
No. Apps cannot be assigned managed identities when running in Azure Arc. If you
### Are there any scaling limits?
-All applications deployed with Azure App Service on Kubernetes with Azure Arc are able to scale within the limits of the underlying Kubernetes cluster. If the underlying Kubernetes Cluster runs out of available compute resources (CPU and memory primarily), then applications will only be able to scale to the number of instances of the application that Kubernetes can schedule with available resource.
+All applications deployed with Azure App Service on Kubernetes with Azure Arc are able to scale within the limits of the underlying Kubernetes cluster. If the underlying Kubernetes cluster runs out of available compute resources (CPU and memory primarily), then applications will only be able to scale to the number of instances of the application that Kubernetes can schedule with available resources.
### What logs are collected?
-Logs for both system components and your applications are written to standard output. Both log types can be collected for analysis using standard Kubernetes tools. You can also configure the App Service cluster extension with a [Log Analytics workspace](../azure-monitor/logs/log-analytics-overview.md), and it will send all logs to that workspace.
+Logs for both system components and your applications are written to standard output. Both log types can be collected for analysis using standard Kubernetes tools. You can also configure the App Service cluster extension with a [Log Analytics workspace](../azure-monitor/logs/log-analytics-overview.md), and it sends all logs to that workspace.
-By default, logs from system components are sent to the Azure team. Application logs are not sent. You can prevent these logs from being transferred by setting `logProcessor.enabled=false` as an extension configuration setting. This will also disable forwarding of application to your Log Analytics workspace. Disabling the log processor may impact time needed for any support cases, and you will be asked to collect logs from standard output through some other means.
+By default, logs from system components are sent to the Azure team. Application logs are not sent. You can prevent these logs from being transferred by setting `logProcessor.enabled=false` as an extension configuration setting. This configuration setting will also disable forwarding of application logs to your Log Analytics workspace. Disabling the log processor might impact time needed for any support cases, and you will be asked to collect logs from standard output through some other means.
### What do I do if I see a provider registration error?
-When creating a Kubernetes environment resource, some subscriptions may see a "No registered resource provider found" error. The error details may include a set of locations and api versions that are considered valid. If this happens, it may be that the subscription needs to be re-registered with the Microsoft.Web provider, an operation which has no impact on existing applications or APIs. To re-register, use the Azure CLI to run `az provider register --namespace Microsoft.Web --wait`. Then re-attempt the Kubernetes environment command.
+When creating a Kubernetes environment resource, some subscriptions might see a "No registered resource provider found" error. The error details might include a set of locations and api versions that are considered valid. If this error message is returned, the subscription must be re-registered with the Microsoft.Web provider, an operation that has no impact on existing applications or APIs. To re-register, use the Azure CLI to run `az provider register --namespace Microsoft.Web --wait`. Then reattempt the Kubernetes environment command.
### Can I deploy the Application services extension on an ARM64 based cluster?
ARM64 based clusters are not supported at this time.
- Initial public preview release of Application services extension.
- Support for code and container-based deployments of Web, Function, and Logic Applications.
-- Web application runtime support - .NET 3.1 and 5.0; Node JS 12 and 14; Python 3.6, 3.7, and 3.8; PHP 7.3 and 7.4; Ruby 2.5, 2.5.5, 2.6, and 2.6.2; Java SE 8u232, 8u242, 8u252, 11.05, 11.06 and 11.07; Tomcat 8.5, 8.5.41, 8.5.53, 8.5.57, 9.0, 9.0.20, 9.0.33, and 9.0.37.
+- Web application runtime support: .NET 3.1 and 5.0; Node JS 12 and 14; Python 3.6, 3.7, and 3.8; PHP 7.3 and 7.4; Ruby 2.5, 2.5.5, 2.6, and 2.6.2; Java SE 8u232, 8u242, 8u252, 11.05, 11.06 and 11.07; Tomcat 8.5, 8.5.41, 8.5.53, 8.5.57, 9.0, 9.0.20, 9.0.33, and 9.0.37.
### Application services extension v 0.10.0 (November 2021)
ARM64 based clusters are not supported at this time.
- Upgrade Azure Function runtime to v3.3.1 - Set default replica count of App Controller and Envoy Controller to 2 to add further stability
-If your extension was in the stable version and auto-upgrade-minor-version is set to true, the extension will upgrade automatically. To manually upgrade the extension to the latest version, you can run the command below:
+If your extension was in the stable version and auto-upgrade-minor-version is set to true, the extension upgrades automatically. To manually upgrade the extension to the latest version, you can run the command:
```azurecli-interactive az k8s-extension update --cluster-type connectedClusters -c <clustername> -g <resource group> -n <extension name> --release-train stable --version 0.10.0
If your extension was in the stable version and auto-upgrade-minor-version is se
- Added Application Insights support for Java and .NET Web Applications
- Added support for .NET 6.0 Web Applications
- Removed .NET Core 2.0
-- Resolved issues with slot swap operations failing
-- Resolved issues during Ruby app creation
+- Resolved issues that caused slot swap operations to fail
+- Resolved issues customers experienced during creation of Ruby web applications
-If your extension was in the stable version and auto-upgrade-minor-version is set to true, the extension will upgrade automatically. To manually upgrade the extension to the latest version, you can run the command below:
+If your extension was in the stable version and auto-upgrade-minor-version is set to true, the extension upgrades automatically. To manually upgrade the extension to the latest version, you can run the command:
```azurecli-interactive az k8s-extension update --cluster-type connectedClusters -c <clustername> -g <resource group> -n <extension name> --release-train stable --version 0.11.0 ```
+### Application services extension v 0.11.1 (December 2021)
+
+- Minor release to resolve issue with CRD update
+
+If your extension was in the stable version and auto-upgrade-minor-version is set to true, the extension upgrades automatically. To manually upgrade the extension to the latest version, you can run the command:
+
+```azurecli-interactive
+ az k8s-extension update --cluster-type connectedClusters -c <clustername> -g <resource group> -n <extension name> --release-train stable --version 0.11.1
+```
+
+### Application services extension v 0.12.0 (January 2022)
+
+- Support for outbound proxy
+- Support for parallel builds in build service
+- Upgrade Envoy to 1.20.1
+- Resolved issue with Application Insights support for .NET Applications
+
+If your extension was in the stable version and auto-upgrade-minor-version is set to true, the extension upgrades automatically. To manually upgrade the extension to the latest version, you can run the command:
+
+```azurecli-interactive
+ az k8s-extension update --cluster-type connectedClusters -c <clustername> -g <resource group> -n <extension name> --release-train stable --version 0.12.0
+```
+ ## Next steps [Create an App Service Kubernetes environment (Preview)](manage-create-arc-environment.md)
app-service Tutorial Connect Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/tutorial-connect-overview.md
+
+ Title: 'Securely connect to Azure resources'
+description: Your app service may need to connect to other Azure services such as a database, storage, or another app. This overview recommends the more secure method for connecting.
++ Last updated : 01/26/2022+
+# Securely connect to Azure services and databases from Azure App Service
+
+Your app service may need to connect to other Azure services such as a database, storage, or another app. This overview recommends the more secure method for connecting.
+
+|Connection method|When to use|
+|--|--|
+|[Direct connection from App Service managed identity](#connect-to-azure-services-with-managed-identity)|Dependent service [supports managed identity](/azure/active-directory/managed-identities-azure-resources/managed-identities-status)<br><br>* Best for enterprise-level security<br>* Connection to dependent service is secured with managed identity<br>* Large team or automated connection string and secret management<br>* Don't manage credentials manually.<br>* Credentials aren't accessible to you.|
+|[Connect using Key Vault secrets from App Service managed identity](#connect-to-key-vault-with-managed-identity)|Dependent service doesn't support managed identity<br><br>* Best for enterprise-level security<br>* Connection includes non-Azure services such as GitHub, Twitter, Facebook, Google<br>* Large team or automated connection string and secret management<br>* Don't manage credentials manually.<br>* Credentials aren't accessible to you.<br>* Manage connection information with environment variables.|
+|[Connect with app settings](#connect-with-app-settings)|* Best for small team or individual owner of Azure resources.<br>* Stage 1 of multi-stage migration to Azure<br>* Temporary or proof-of-concept applications<br>* Manually manage connection information with environment variables|
+
+## Connect to Azure services with managed identity
+
+Use [managed identity](/azure/active-directory/managed-identities-azure-resources/overview) to authenticate from one Azure resource, such as Azure App Service, to another Azure resource whenever possible. This level of authentication lets Azure manage the authentication process, after the required setup is complete. Once the connection is set up, you won't need to manage the connection.
+
+Benefits of managed identity:
+
+* Automated credentials management
+* Many Azure services are included
+* No additional cost
+* No code changes
++
+Learn which [services](/azure/active-directory/managed-identities-azure-resources/managed-identities-status) are supported with managed identity and what [operations you can perform](/azure/active-directory/managed-identities-azure-resources/overview).
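Under the hood, an app running in App Service obtains tokens from a local identity endpoint that the platform injects through environment variables. The sketch below shows how that request is formed; the endpoint shape and the `2019-08-01` api-version follow the App Service managed identity REST protocol, but verify them against current documentation, and prefer the `azure-identity` library in real code:

```python
import os
import urllib.parse

def build_token_request(resource: str, api_version: str = "2019-08-01"):
    """Build the URL and headers for an App Service managed identity token request.

    App Service injects IDENTITY_ENDPOINT and IDENTITY_HEADER into the app's
    environment; the header proves the caller is the app itself.
    """
    query = urllib.parse.urlencode({"resource": resource, "api-version": api_version})
    url = f"{os.environ['IDENTITY_ENDPOINT']}?{query}"
    headers = {"X-IDENTITY-HEADER": os.environ["IDENTITY_HEADER"]}
    return url, headers

# A real app would GET this URL and read the access_token field of the JSON
# response; in practice, use DefaultAzureCredential from azure-identity instead.
```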
+
+### Example managed identity scenario
+
+The following image demonstrates an App Service connecting to other Azure services:
+
+* A: User visits Azure app service website.
+* B: Securely **connect from** App Service **to** another Azure service using managed identity.
+* C: Securely **connect from** App Service **to** Microsoft Graph.
++
+## Connect to Key Vault with managed identity
+
+When managed identity isn't supported for your app's dependent services, use Key Vault to store your secrets, and connect your app to Key Vault with a managed identity.
+
+Secrets include:
+
+|Secret|Example|
+|--|--|
+|Certificates|SSL certificates|
+|Keys and access tokens|Cognitive service API Key<br>GitHub personal access token<br>Twitter consumer keys and authentication tokens|
+|Connection strings|Database connection strings such as SQL server or MongoDB|
++
+Benefits of managed identity integrated with Key Vault include:
+
+* Connectivity to Key Vault is secured by managed identities
+* Access to the Key Vault is restricted to the app. App contributors, such as administrators, may have complete control of the App Service resources, and at the same time have no access to the Key Vault secrets.
+* No code change is required if your application code already accesses connection secrets with app settings.
+* Monitoring and auditing of who accessed secrets.
+* Rotation of connection information in Key Vault requires no changes in App Service.
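A common way to wire this up without code changes is a Key Vault reference in app settings: the app keeps reading an ordinary setting, and App Service resolves it from the vault using the managed identity. A sketch of the reference syntax follows; the vault and secret names are placeholders:

```json
{
  "name": "DatabaseConnection",
  "value": "@Microsoft.KeyVault(SecretUri=https://contoso-vault.vault.azure.net/secrets/DatabaseConnection/)",
  "slotSetting": false
}
```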
+
+## Connect with app settings
+
+The App Service provides [App settings](configure-common.md?tabs=portal#configure-app-settings) to store connection strings, API keys, and other environment variables. While App Service does provide encryption for app settings, for enterprise-level security, consider other services that provide additional benefits for managing these types of secrets.
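Setting such a value from the CLI looks like the following; the resource names and the setting key are placeholders:

```azurecli-interactive
az webapp config appsettings set \
    --resource-group <resource-group-name> \
    --name <app-name> \
    --settings DATABASE_URL="<connection-string>"
```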
+
+**App settings** are best used when:
+
+* Security of connection information is manual and limited to a few people
+* Web app is temporary, proof-of-concept, or in first migration stage to Azure
+
+**App Service** managed identity to another Azure service is best used when:
+
+* You don't need to manage Azure credentials. Credentials aren't even accessible to you.
+* You can use managed identities to authenticate to any resource that supports Azure Active Directory authentication including your own applications.
+* Managed identities can be used without any additional cost.
+
+**Key Vault** integration from App Service with managed identity is best used when:
+
+* Connectivity to Key Vault is secured by managed identities.
+* Access to the Key Vault is restricted to the app. App contributors, such as administrators, may have complete control of the App Service resources, and at the same time have no access to the Key Vault secrets.
+* No code change is required if your application code already accesses connection secrets with app settings.
+* Monitoring and auditing of who accessed secrets.
++
+## Next steps
+
+* Learn how to use App Service managed identity with:
+ * [SQL server](tutorial-connect-msi-sql-database.md?tabs=windowsclient%2Cdotnet)
+ * [Azure storage](scenario-secure-app-access-storage.md?tabs=azure-portal%2Cprogramming-language-csharp)
+ * [Microsoft Graph](scenario-secure-app-access-microsoft-graph-as-app.md?tabs=azure-powershell%2Cprogramming-language-csharp)
app-service Tutorial Nodejs Mongodb App https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/tutorial-nodejs-mongodb-app.md
Title: 'Tutorial: Node.js app with MongoDB'
-description: Learn how to get a Node.js app working in Azure, with connection to a MongoDB database in Azure (Cosmos DB). Sails.js and Angular 12 are used in the tutorial.
-
+ Title: Deploy a Node.js web app using MongoDB to Azure
+description: This article shows you how to deploy a Node.js app using Express.js and a MongoDB database to Azure. Azure App Service is used to host the web application and Azure Cosmos DB to host the database using the 100% compatible MongoDB API built into Cosmos DB.
Previously updated : 07/13/2021-
-zone_pivot_groups: app-service-platform-windows-linux
Last updated : 01/31/2022+
+ms.role: developer
+ms.devlang: javascript
+
-# Tutorial: Build a Node.js and MongoDB app in Azure
--
-[Azure App Service](overview.md) provides a highly scalable, self-patching web hosting service. This tutorial shows how to create a Node.js app in App Service on Windows and connect it to a MongoDB database. When you're done, you'll have a MEAN application (MongoDB, Express, AngularJS, and Node.js) running in [Azure App Service](overview.md). The sample application uses a combination of [Sails.js](https://sailsjs.com/) and [Angular 12](https://angular.io/).
----
-[Azure App Service](overview.md) provides a highly scalable, self-patching web hosting service using the Linux operating system. This tutorial shows how to create a Node.js app in App Service on Linux, connect it locally to a MongoDB database, then deploy it to a database in Azure Cosmos DB's API for MongoDB. When you're done, you'll have a MEAN application (MongoDB, Express, AngularJS, and Node.js) running in App Service on Linux. The sample application uses a combination of [Sails.js](https://sailsjs.com/) and [Angular 12](https://angular.io/).
--
-![MEAN app running in Azure App Service](./media/tutorial-nodejs-mongodb-app/run-in-azure.png)
-
-What you'll learn:
-
-> [!div class="checklist"]
-> * Create a MongoDB database in Azure
-> * Connect a Node.js app to MongoDB
-> * Deploy the app to Azure
-> * Update the data model and redeploy the app
-> * Stream diagnostic logs from Azure
-> * Manage the app in the Azure portal
--
-## Prerequisites
-
-To complete this tutorial:
-
-- [Install Git](https://git-scm.com/)
-- [Install Node.js and NPM](https://nodejs.org/)
-
-## Create local Node.js app
-
-In this step, you set up the local Node.js project.
+# Deploy a Node.js + MongoDB web app to Azure
-### Clone the sample application
+In this tutorial, you'll deploy a sample **Express.js** app using a **MongoDB** database to Azure. The Express.js app will be hosted in Azure App Service, which supports hosting Node.js apps in both Linux (Node versions 12, 14, and 16) and Windows (Node versions 12 and 14) server environments. The MongoDB database will be hosted in Azure Cosmos DB, a cloud-native database offering a [100% MongoDB compatible API](/azure/cosmos-db/mongodb/mongodb-introduction).
-In the terminal window, `cd` to a working directory.
-Run the following command to clone the sample repository.
+This article assumes you are already familiar with [Node.js development](/learn/paths/build-javascript-applications-nodejs/) and have Node and MongoDB installed locally. You'll also need an Azure account with an active subscription. If you do not have an Azure account, you [can create one for free](https://azure.microsoft.com/free/nodejs/).
-```bash
-git clone https://github.com/Azure-Samples/mean-todoapp.git
-```
-
-> [!NOTE]
-> For information on how the sample app is created, see [https://github.com/Azure-Samples/mean-todoapp](https://github.com/Azure-Samples/mean-todoapp).
+## Sample application
-### Run the application
-
-Run the following commands to install the required packages and start the application.
+To follow along with this tutorial, clone or download the sample application from the repository [https://github.com/Azure-Samples/msdocs-nodejs-mongodb-azure-sample-app](https://github.com/Azure-Samples/msdocs-nodejs-mongodb-azure-sample-app).
```bash
-cd mean-todoapp
-npm install
-node app.js --alter
+git clone https://github.com/Azure-Samples/msdocs-nodejs-mongodb-azure-sample-app.git
```
-When the app is fully loaded, you see something similar to the following message:
-
-<pre>
-debug: -
-debug: :: Fri Jul 09 2021 13:10:34 GMT+0200 (Central European Summer Time)
+Follow these steps to run the application locally:
-debug: Environment : development
-debug: Port : 1337
-debug: -
-</pre>
+* Install the package dependencies by running `npm install`
+* Copy the `.env.sample` file to `.env` and populate the `DATABASE_URL` value with your MongoDB URL (for example *mongodb://localhost:27017/*)
+* Start the application using `npm start`
+* To view the app, browse to `http://localhost:3000`
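The sample app itself uses the dotenv package to load `.env` values; as an illustration of what that amounts to, the sketch below emulates the behavior with a minimal hand-rolled parser (the function names here are illustrative, not the sample's actual code):

```javascript
// Minimal .env parser: KEY=VALUE lines, ignoring comments and blanks.
// The real sample app uses the dotenv package instead.
function parseEnv(text) {
  const out = {};
  for (const line of text.split('\n')) {
    const trimmed = line.trim();
    if (!trimmed || trimmed.startsWith('#')) continue;
    const eq = trimmed.indexOf('=');
    if (eq === -1) continue;
    out[trimmed.slice(0, eq).trim()] = trimmed.slice(eq + 1).trim();
  }
  return out;
}

// Prefer a value already set in the environment, then fall back to the .env file
function getDatabaseUrl(envFileText) {
  return process.env.DATABASE_URL || parseEnv(envFileText).DATABASE_URL;
}

console.log(getDatabaseUrl('# local settings\nDATABASE_URL=mongodb://localhost:27017/'));
```

This matters later in the tutorial: in Azure the same `DATABASE_URL` value arrives through App Service application settings rather than a `.env` file.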
-Navigate to `http://localhost:1337` in a browser. Add a few todo items.
+## 1 - Create the Azure App Service
-The MEAN sample application stores user data in the database. By default, it uses a disk-based development database. If you can create and see todo items, then your app is reading and writing data.
+Azure App Service is used to host the Express.js web app. When setting up the App Service for the application, you will specify:
-![MEAN app loaded successfully](./media/tutorial-nodejs-mongodb-app/run-locally.png)
+* The **Name** for the web app. This name is used as part of the DNS name for your web app in the form `https://<app-name>.azurewebsites.net`.
+* The **Runtime** for the app. This is where you select the version of Node to use for your app.
+* The **App Service plan** which defines the compute resources (CPU, memory) available for the application.
+* The **Resource Group** for the app. A resource group lets you group all of the Azure resources needed for the application together in a logical container.
-To stop Node.js at any time, press `Ctrl+C` in the terminal.
+Azure resources can be created using the [Azure portal](https://portal.azure.com/), VS Code with the [Azure Tools extension pack](https://marketplace.visualstudio.com/items?itemName=ms-vscode.vscode-node-azure-pack), or the Azure CLI.
-## Create production MongoDB
+### [Azure portal](#tab/azure-portal)
-In this step, you create a MongoDB database in Azure. When your app is deployed to Azure, it uses this cloud database.
+Sign in to the [Azure portal](https://portal.azure.com/) and follow these steps to create your Azure App Service resources.
-For MongoDB, this tutorial uses [Azure Cosmos DB](../cosmos-db/index.yml). Cosmos DB supports MongoDB client connections.
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Create app service step 1](<./includes/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-1.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-1-240px.png" alt-text="A screenshot showing how to use the search box in the top tool bar to find App Services in Azure." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-1.png"::: |
+| [!INCLUDE [Create app service step 2](<./includes/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-2.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-2-240px.png" alt-text="A screenshot showing the create button on the App Services page used to create a new web app." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-2.png"::: |
+| [!INCLUDE [Create app service step 3](<./includes/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-3.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-3-240px.png" alt-text="A screenshot showing the form to fill out to create a web app in Azure." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-3.png"::: |
+| [!INCLUDE [Create app service step 4](<./includes/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-4.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-4-240px.png" alt-text="A screenshot of the Spec Picker dialog that allows you to select the App Service plan to use for your web app." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-4.png"::: |
+| [!INCLUDE [Create app service step 4](<./includes/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-5.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-5-240px.png" alt-text="A screenshot of the main web app create page showing the button to select on to create your web app in Azure." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-azure-portal-5.png"::: |
-### Create a resource group
+### [VS Code](#tab/vscode-aztools)
+To create Azure resources in VS Code, you must have the [Azure Tools extension pack](https://marketplace.visualstudio.com/items?itemName=ms-vscode.vscode-node-azure-pack) installed and be signed into Azure from VS Code.
-### Create a Cosmos DB account
+> [!div class="nextstepaction"]
+> [Download Azure Tools extension pack](https://marketplace.visualstudio.com/items?itemName=ms-vscode.vscode-node-azure-pack)
-> [!NOTE]
-> There is a cost to creating the Azure Cosmos DB databases in this tutorial in your own Azure subscription. To use a free Azure Cosmos DB account for seven days, you can use the [Try Azure Cosmos DB for free](https://azure.microsoft.com/try/cosmosdb/) experience. Just click the **Create** button in the MongoDB tile to create a free MongoDB database on Azure. Once the database is created, navigate to **Connection String** in the portal and retrieve your Azure Cosmos DB connection string for use later in the tutorial.
->
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Create app service step 1](<./includes/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-01.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-01-240px.png" alt-text="A screenshot showing the location of the Azure Tools icon in the left toolbar." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-01.png"::: |
+| [!INCLUDE [Create app service step 2](<./includes/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-02.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-02-240px.png" alt-text="A screenshot showing the App Service section of Azure Tools showing how to create a new web app." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-02.png"::: |
+| [!INCLUDE [Create app service step 3](<./includes/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-03.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-03-240px.png" alt-text="A screenshot showing the dialog box used to enter the name of the web app in Azure." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-03.png"::: |
+| [!INCLUDE [Create app service step 4](<./includes/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-04.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-04-240px.png" alt-text="A screenshot of dialog box used to select a resource group or create a new one for the web app." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-04.png"::: |
+| [!INCLUDE [Create app service step 5](<./includes/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-05.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-05-240px.png" alt-text="A screenshot of the dialog box in VS Code used enter a name for the resource group." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-05.png"::: |
+| [!INCLUDE [Create app service step 6](<./includes/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-06.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-06-240px.png" alt-text="A screenshot of the dialog box in VS Code used to select Node 14 LTS as the runtime for the web app." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-06.png"::: |
+| [!INCLUDE [Create app service step 7](<./includes/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-07.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-07-240px.png" alt-text="A screenshot of the dialog in VS Code used to select operating system to use for hosting the web app." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-07.png"::: |
+| [!INCLUDE [Create app service step 8](<./includes/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-08.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-08-240px.png" alt-text="A screenshot of the dialog in VS Code used to select location of the web app resources." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-08.png"::: |
+| [!INCLUDE [Create app service step 9](<./includes/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-09.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-09-240px.png" alt-text="A screenshot of the dialog in VS Code used to select an App Service plan or create a new one." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-09.png"::: |
+| [!INCLUDE [Create app service step 10](<./includes/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-10.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-10-240px.png" alt-text="A screenshot of the dialog in VS Code used to enter the name of the App Service plan." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-10.png"::: |
+| [!INCLUDE [Create app service step 11](<./includes/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-11.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-11-240px.png" alt-text="A screenshot of the dialog in VS Code used to select the pricing tier of the App Service plan." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-11.png"::: |
+| [!INCLUDE [Create app service step 12](<./includes/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-12.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-12-240px.png" alt-text="A screenshot of the dialog in VS Code asking if you want to create an App Insights resource for your web app." lightbox="./media/tutorial-nodejs-mongodb-app/create-app-service-visual-studio-code-12.png"::: |
-In the Cloud Shell, create a Cosmos DB account with the [`az cosmosdb create`](/cli/azure/cosmosdb#az_cosmosdb_create) command.
-
-In the following command, substitute a unique Cosmos DB name for the *\<cosmosdb-name>* placeholder. This name is used as the part of the Cosmos DB endpoint, `https://<cosmosdb-name>.documents.azure.com/`, so the name needs to be unique across all Cosmos DB accounts in Azure. The name must contain only lowercase letters, numbers, and the hyphen (-) character, and must be between 3 and 50 characters long.
-
-```azurecli-interactive
-az cosmosdb create --name <cosmosdb-name> --resource-group myResourceGroup --kind MongoDB
-```
-
-The *--kind MongoDB* parameter enables MongoDB client connections.
-
-When the Cosmos DB account is created, the Azure CLI shows information similar to the following example:
-
-<pre>
-{
- "apiProperties": {
- "serverVersion": "3.6"
- },
- "backupPolicy": {
- "periodicModeProperties": {
- "backupIntervalInMinutes": 240,
- "backupRetentionIntervalInHours": 8,
- "backupStorageRedundancy": "Geo"
- },
- "type": "Periodic"
- },
- "capabilities": [
- {
- "name": "EnableMongo"
- }
- ],
- "connectorOffer": null,
- "consistencyPolicy": {
- "defaultConsistencyLevel": "Session",
- "maxIntervalInSeconds": 5,
- "maxStalenessPrefix": 100
- },
- "cors": [],
- "databaseAccountOfferType": "Standard",
- "defaultIdentity": "FirstPartyIdentity",
- "disableKeyBasedMetadataWriteAccess": false,
- "documentEndpoint": "https://&lt;cosmosdb-name&gt;.documents.azure.com:443/",
- ...
- &lt; Output truncated for readability &gt;
-}
-</pre>
-
-## Connect app to production MongoDB
-
-In this step, you connect your sample application to the Cosmos DB database you just created, using a MongoDB connection string.
-
-### Retrieve the database key
-
-To connect to the Cosmos DB database, you need the database key. In the Cloud Shell, use the [`az cosmosdb keys list`](/cli/azure/cosmosdb#az_cosmosdb_keys_list) command to retrieve the primary key.
-
-```azurecli-interactive
-az cosmosdb keys list --name <cosmosdb-name> --resource-group myResourceGroup
-```
+### [Azure CLI](#tab/azure-cli)
-The Azure CLI shows information similar to the following example:
-<pre>
-{
- "primaryMasterKey": "RS4CmUwzGRASJPMoc0kiEvdnKmxyRILC9BWisAYh3Hq4zBYKr0XQiSE4pqx3UchBeO4QRCzUt1i7w0rOkitoJw==",
- "primaryReadonlyMasterKey": "HvitsjIYz8TwRmIuPEUAALRwqgKOzJUjW22wPL2U8zoMVhGvregBkBk9LdMTxqBgDETSq7obbwZtdeFY7hElTg==",
- "secondaryMasterKey": "Lu9aeZTiXU4PjuuyGBbvS1N9IRG3oegIrIh95U6VOstf9bJiiIpw3IfwSUgQWSEYM3VeEyrhHJ4rn3Ci0vuFqA==",
- "secondaryReadonlyMasterKey": "LpsCicpVZqHRy7qbMgrzbRKjbYCwCKPQRl0QpgReAOxMcggTvxJFA94fTi0oQ7xtxpftTJcXkjTirQ0pT7QFrQ=="
-}
-</pre>
-
-Copy the value of `primaryMasterKey`. You need this information in the next step.
-
-<a name="devconfig"></a>
-### Configure the connection string in your sample application
-
-In your local repository, in _config/datastores.js_, replace the existing content with the following code and save your changes.
-
-```javascript
-module.exports.datastores = {
- default: {
- adapter: 'sails-mongo',
- url: process.env.MONGODB_URI,
- ssl: true,
- },
-};
-```
-
-The `ssl: true` option is required because [Cosmos DB requires TLS/SSL](../cosmos-db/connect-mongodb-account.md#connection-string-requirements). `url` is set to an environment variable, which you will set next.
-
-In the terminal, set the `MONGODB_URI` environment variable. Be sure to replace the two \<cosmosdb-name> placeholders with your Cosmos DB database name, and replace the \<cosmosdb-key> placeholder with the key you copied in the previous step.
-
-```bash
-export MONGODB_URI=mongodb://<cosmosdb-name>:<cosmosdb-key>@<cosmosdb-name>.documents.azure.com:10250/todoapp
-```
-
-> [!NOTE]
-> This connection string follows the format defined in the [Sails.js documentation](https://sailsjs.com/documentation/reference/configuration/sails-config-datastores#?the-connection-url).
-
-### Test the application with MongoDB
-
-In a local terminal window, run `node app.js --alter` again.
-
-```bash
-node app.js --alter
-```
-
-Navigate to `http://localhost:1337` again. If you can create and see todo items, then your app is reading and writing data using the Cosmos DB database in Azure.
-
-In the terminal, stop Node.js by typing `Ctrl+C`.
-
-## Deploy app to Azure
-
-In this step, you deploy your MongoDB-connected Node.js application to Azure App Service.
+
-### Configure a deployment user
+## 2 - Create an Azure Cosmos DB in MongoDB compatibility mode
+Azure Cosmos DB is a fully managed NoSQL database for modern app development. Among its features is a 100% MongoDB compatible API, allowing you to use your existing MongoDB tools, packages, and applications with Cosmos DB.
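Because the API is wire-compatible, a standard MongoDB connection string is all a driver needs, whether it points at a local MongoDB or at Cosmos DB. As a quick illustration (the helper below is hypothetical, not part of the sample app), Node's built-in URL parser can pick apart the pieces a driver expects:

```javascript
// Hypothetical helper: parse a MongoDB connection string with Node's URL class
// to show the host, port, database, and TLS flag a driver would see.
function describeMongoUrl(connectionString) {
  const url = new URL(connectionString);
  if (url.protocol !== 'mongodb:' && url.protocol !== 'mongodb+srv:') {
    throw new Error(`Unexpected scheme: ${url.protocol}`);
  }
  return {
    host: url.hostname,
    port: url.port || '(default 27017)',
    database: url.pathname.replace(/^\//, '') || '(none)',
    tls: url.searchParams.get('ssl') === 'true' || url.searchParams.get('tls') === 'true',
  };
}

console.log(describeMongoUrl('mongodb://localhost:27017/todo-app'));
```

A Cosmos DB connection string (copied from the portal in a later step) parses the same way; only the host, credentials, and query options differ.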
-### Create an App Service plan
+### [Azure portal](#tab/azure-portal)
+Sign in to the [Azure portal](https://portal.azure.com/) and follow these steps to create your Cosmos DB account.
-In the Cloud Shell, create an App Service plan with the [`az appservice plan create`](/cli/azure/appservice/plan) command.
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Create Cosmos DB step 1](<./includes/tutorial-nodejs-mongodb-app/create-cosmos-db-azure-portal-1.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-azure-portal-1-240px.png" alt-text="A screenshot showing how to use the search box in the top tool bar to find Cosmos DB in Azure." lightbox="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-azure-portal-1.png"::: |
+| [!INCLUDE [Create Cosmos DB step 2](<./includes/tutorial-nodejs-mongodb-app/create-cosmos-db-azure-portal-2.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-azure-portal-2-240px.png" alt-text="A screenshot showing the create button on the Cosmos DB page used to create a database." lightbox="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-azure-portal-2.png"::: |
+| [!INCLUDE [Create Cosmos DB step 3](<./includes/tutorial-nodejs-mongodb-app/create-cosmos-db-azure-portal-3.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-azure-portal-3-240px.png" alt-text="A screenshot showing the page where you select the MongoDB API for your Cosmos DB." lightbox="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-azure-portal-3.png"::: |
+| [!INCLUDE [Create Cosmos DB step 4](<./includes/tutorial-nodejs-mongodb-app/create-cosmos-db-azure-portal-4.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-azure-portal-4-240px.png" alt-text="A screenshot showing how to fill out the page to create a new Cosmos DB." lightbox="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-azure-portal-4.png"::: |
-The following example creates an App Service plan named `myAppServicePlan` in the **B1** pricing tier:
+### [VS Code](#tab/vscode-aztools)
-```azurecli-interactive
-az appservice plan create --name myAppServicePlan --resource-group myResourceGroup --sku B1
-```
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Create Cosmos DB step 1](<./includes/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-1.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-1-240px.png" alt-text="A screenshot showing the databases component of the Azure Tools VS Code extension and the location of the button to create a new database." lightbox="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-1.png"::: |
+| [!INCLUDE [Create Cosmos DB step 2](<./includes/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-2.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-2-240px.png" alt-text="A screenshot showing the dialog box used to select the subscription for the new database in Azure." lightbox="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-2.png"::: |
+| [!INCLUDE [Create Cosmos DB step 3](<./includes/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-3.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-3-240px.png" alt-text="A screenshot showing the dialog box used to select the type of database you want to create in Azure." lightbox="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-3.png"::: |
+| [!INCLUDE [Create Cosmos DB step 4](<./includes/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-4.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-4-240px.png" alt-text="A screenshot of dialog box used to enter the name of the new database in Visual Studio Code." lightbox="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-4.png"::: |
+| [!INCLUDE [Create Cosmos DB step 5](<./includes/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-5.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-5-240px.png" alt-text="A screenshot of the dialog to select the throughput mode of the database." lightbox="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-5.png"::: |
+| [!INCLUDE [Create Cosmos DB step 6](<./includes/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-6.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-6-240px.png" alt-text="A screenshot of the dialog in VS Code used to select resource group to put the new database in." lightbox="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-6.png"::: |
+| [!INCLUDE [Create Cosmos DB step 7](<./includes/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-7.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-7-240px.png" alt-text="A screenshot of the dialog in VS Code used to select location for the new database." lightbox="./media/tutorial-nodejs-mongodb-app/create-cosmos-db-visual-studio-code-7.png"::: |
-When the App Service plan has been created, the Azure CLI shows information similar to the following example:
+### [Azure CLI](#tab/azure-cli)
-<pre>
-{
- "freeOfferExpirationTime": null,
- "geoRegion": "UK West",
- "hostingEnvironmentProfile": null,
- "hyperV": false,
- "id": "/subscriptions/0000-0000/resourceGroups/myResourceGroup/providers/Microsoft.Web/serverfarms/myAppServicePlan",
- "isSpot": false,
- "isXenon": false,
- "kind": "app",
- "location": "ukwest",
- "maximumElasticWorkerCount": 1,
- "maximumNumberOfWorkers": 0,
- &lt; JSON data removed for brevity. &gt;
-}
-</pre>
+
+## 3 - Connect your App Service to your Cosmos DB
-In the Cloud Shell, create an App Service plan with the [`az appservice plan create`](/cli/azure/appservice/plan) command.
+To connect to your Cosmos DB database, you need to provide the connection string for the database to your application. This is done in the sample application by reading the `DATABASE_URL` environment variable. When running locally, the sample application uses the [dotenv package](https://www.npmjs.com/package/dotenv) to read the connection string value from the `.env` file.
-<!-- [!INCLUDE [app-service-plan](app-service-plan.md)] -->
+When running in Azure, configuration values like connection strings can be stored in the *application settings* of the App Service hosting the web app. These values are then made available to your application as environment variables during runtime. In this way, the application accesses the connection string from `process.env` the same way whether being run locally or in Azure. Further, this eliminates the need to manage and deploy environment-specific config files with your application.
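The net effect is one configuration code path. A sketch of what that startup logic could look like (names here are illustrative; the sample's actual code may differ):

```javascript
// One code path: locally, dotenv populates process.env from .env;
// in App Service, application settings arrive as environment variables.
function loadConfig(env = process.env) {
  const databaseUrl = env.DATABASE_URL;
  if (!databaseUrl) {
    throw new Error(
      'DATABASE_URL is not set; add it to .env locally or to App Service application settings.'
    );
  }
  // App Service also injects PORT; fall back to 3000 for local runs
  return { databaseUrl, port: Number(env.PORT) || 3000 };
}

console.log(loadConfig({ DATABASE_URL: 'mongodb://localhost:27017/todo-app' }));
```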
-The following example creates an App Service plan named `myAppServicePlan` in the **B1** pricing tier:
+### [Azure portal](#tab/azure-portal)
-```azurecli-interactive
-az appservice plan create --name myAppServicePlan --resource-group myResourceGroup --sku B1 --is-linux
-```
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Connection string step 1](<./includes/tutorial-nodejs-mongodb-app/connection-string-azure-portal-1.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/connection-string-azure-portal-1-240px.png" alt-text="A screenshot showing the location of the Cosmos DB connection string on the Cosmos DB quick start page." lightbox="./media/tutorial-nodejs-mongodb-app/connection-string-azure-portal-1.png"::: |
+| [!INCLUDE [Connection string step 2](<./includes/tutorial-nodejs-mongodb-app/connection-string-azure-portal-2.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/connection-string-azure-portal-2-240px.png" alt-text="A screenshot showing how to search for and navigate to the App Service where the connection string needs to store the connection string." lightbox="./media/tutorial-nodejs-mongodb-app/connection-string-azure-portal-2.png"::: |
+| [!INCLUDE [Connection string step 3](<./includes/tutorial-nodejs-mongodb-app/connection-string-azure-portal-3.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/connection-string-azure-portal-3-240px.png" alt-text="A screenshot showing how to access the Application settings within an App Service." lightbox="./media/tutorial-nodejs-mongodb-app/connection-string-azure-portal-3.png"::: |
+| [!INCLUDE [Connection string step 4](<./includes/tutorial-nodejs-mongodb-app/connection-string-azure-portal-4.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/connection-string-azure-portal-4-240px.png" alt-text="A screenshot showing the dialog used to set an application setting in Azure App Service." lightbox="./media/tutorial-nodejs-mongodb-app/connection-string-azure-portal-4.png"::: |
-When the App Service plan has been created, the Azure CLI shows information similar to the following example:
+### [VS Code](#tab/vscode-aztools)
-<pre>
-{
- "freeOfferExpirationTime": null,
- "geoRegion": "West Europe",
- "hostingEnvironmentProfile": null,
- "id": "/subscriptions/0000-0000/resourceGroups/myResourceGroup/providers/Microsoft.Web/serverfarms/myAppServicePlan",
- "kind": "linux",
- "location": "West Europe",
- "maximumNumberOfWorkers": 1,
- "name": "myAppServicePlan",
- &lt; JSON data removed for brevity. &gt;
- "targetWorkerSizeId": 0,
- "type": "Microsoft.Web/serverfarms",
- "workerTierName": null
-}
-</pre>
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Connection string step 1](<./includes/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-1.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-1-240px.png" alt-text="A screenshot showing how to copy the connection string for a Cosmos database to your clipboard in VS Code." lightbox="./media/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-1.png"::: |
+| [!INCLUDE [Connection string step 2](<./includes/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-2.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-2-240px.png" alt-text="A screenshot showing how to add a config setting to an App Service in VS Code." lightbox="./media/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-2.png"::: |
+| [!INCLUDE [Connection string step 3](<./includes/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-3.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-3-240px.png" alt-text="A screenshot showing the dialog box used to give a name to an app setting in VS Code." lightbox="./media/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-3.png"::: |
+| [!INCLUDE [Connection string step 4](<./includes/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-4.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-4-240px.png" alt-text="A screenshot showing the dialog used to set the value of an app setting in VS Code." lightbox="./media/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-4.png"::: |
+| [!INCLUDE [Connection string step 5](<./includes/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-5.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-5-240px.png" alt-text="A screenshot showing how to view an app setting for an App Service in VS Code." lightbox="./media/tutorial-nodejs-mongodb-app/connection-string-visual-studio-code-5.png"::: |
+### [Azure CLI](#tab/azure-cli)
-<a name="create"></a>
-### Create a web app
-+
+## 4 - Deploy application code to Azure
+Azure App Service supports multiple methods to deploy your application code to Azure, including GitHub Actions and all major CI/CD tools. This article focuses on deploying your code from your local workstation to Azure.
+### [Deploy using VS Code](#tab/vscode-deploy)
+To deploy your application code directly from VS Code, you must have the [Azure Tools extension pack](https://marketplace.visualstudio.com/items?itemName=ms-vscode.vscode-node-azure-pack) installed and be signed into Azure from VS Code.
-### Configure an environment variable
+> [!div class="nextstepaction"]
+> [Download Azure Tools extension pack](https://marketplace.visualstudio.com/items?itemName=ms-vscode.vscode-node-azure-pack)
-Remember that the sample application is already configured to use the `MONGODB_URI` environment variable in `config/datastores.js`. In App Service, you inject this variable by using an [app setting](configure-common.md#configure-app-settings).
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Deploy from VS Code 1](<./includes/tutorial-nodejs-mongodb-app/deploy-visual-studio-code-1.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/deploy-visual-studio-code-1-240px.png" alt-text="A screenshot showing the location of the Azure Tool icon in Visual Studio Code." lightbox="./media/tutorial-nodejs-mongodb-app/deploy-visual-studio-code-1.png"::: |
+| [!INCLUDE [Deploy from VS Code 2](<./includes/tutorial-nodejs-mongodb-app/deploy-visual-studio-code-2.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/deploy-visual-studio-code-2-240px.png" alt-text="A screenshot showing how you deploy an application to Azure by right-clicking on a web app in VS Code and selecting deploy from the context menu." lightbox="./media/tutorial-nodejs-mongodb-app/deploy-visual-studio-code-2.png"::: |
+| [!INCLUDE [Deploy from VS Code 3](<./includes/tutorial-nodejs-mongodb-app/deploy-visual-studio-code-3.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/deploy-visual-studio-code-3-240px.png" alt-text="A screenshot showing the dialog box used to select the deployment directory in VS Code." lightbox="./media/tutorial-nodejs-mongodb-app/deploy-visual-studio-code-3.png"::: |
+| [!INCLUDE [Deploy from VS Code 4](<./includes/tutorial-nodejs-mongodb-app/deploy-visual-studio-code-4.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/deploy-visual-studio-code-4-240px.png" alt-text="A screenshot showing the Output window of VS Code while deploying an application to Azure." lightbox="./media/tutorial-nodejs-mongodb-app/deploy-visual-studio-code-4.png"::: |
-To set app settings, use the [`az webapp config appsettings set`](/cli/azure/webapp/config/appsettings#az_webapp_config_appsettings_set) command in the Cloud Shell.
-The following example configures a `MONGODB_URI` app setting in your Azure app. Replace the *\<app-name>*, *\<cosmosdb-name>*, and *\<cosmosdb-key>* placeholders.
+### [Deploy using Local Git](#tab/local-git-deploy)
-```azurecli-interactive
-az webapp config appsettings set --name <app-name> --resource-group myResourceGroup --settings MONGODB_URI='mongodb://<cosmosdb-name>:<cosmosdb-key>@<cosmosdb-name>.documents.azure.com:10250/todoapp' DEPLOYMENT_BRANCH='main'
-```
-> [!NOTE]
-> `DEPLOYMENT_BRANCH` is a special app setting that tells the deployment engine which Git branch you're deploying to in App Service.
-
-### Push to Azure from Git
---
-<pre>
-Enumerating objects: 5, done.
-Counting objects: 100% (5/5), done.
-Delta compression using up to 8 threads
-Compressing objects: 100% (3/3), done.
-Writing objects: 100% (3/3), 318 bytes | 318.00 KiB/s, done.
-Total 3 (delta 2), reused 0 (delta 0), pack-reused 0
-remote: Updating branch 'main'.
-remote: Updating submodules.
-remote: Preparing deployment for commit id '4eb0ca7190'.
-remote: Generating deployment script.
-remote: Running deployment command...
-remote: Handling node.js deployment.
-remote: Creating app_offline.htm
-remote: KuduSync.NET from: 'D:\home\site\repository' to: 'D:\home\site\wwwroot'
-remote: Copying file: 'package.json'
-remote: Deleting app_offline.htm
-remote: Looking for app.js/server.js under site root.
-remote: Using start-up script app.js
-remote: Generated web.config.
-.
-.
-.
-remote: Deployment successful.
-To https://&lt;app-name&gt;.scm.azurewebsites.net/&lt;app-name&gt;.git
- * [new branch]      main -> main
-</pre>
-
-> [!TIP]
-> During Git deployment, the deployment engine runs `npm install --production` as part of its build automation.
->
-> - As defined in `package.json`, the `postinstall` script is picked up by `npm install` and runs `ng build` to generate the production files for Angular and deploy them to the [assets](https://sailsjs.com/documentation/concepts/assets) folder.
-> - `scripts` in `package.json` can use tools that are installed in `node_modules/.bin`. Since `npm install` has installed `node_modules/.bin/ng` too, you can use it to deploy your Angular client files. This npm behavior is exactly the same in Azure App Service.
-> Packages under `devDependencies` in `package.json` are not installed. Any package you need in the production environment needs to be moved under `dependencies`.
->
-> If your app needs to bypass the default automation and run custom automation, see [Run Grunt/Bower/Gulp](configure-language-nodejs.md#run-gruntbowergulp).
---
-<pre>
-Enumerating objects: 5, done.
-Counting objects: 100% (5/5), done.
-Delta compression using up to 8 threads
-Compressing objects: 100% (3/3), done.
-Writing objects: 100% (3/3), 347 bytes | 347.00 KiB/s, done.
-Total 3 (delta 2), reused 0 (delta 0), pack-reused 0
-remote: Deploy Async
-remote: Updating branch 'main'.
-remote: Updating submodules.
-remote: Preparing deployment for commit id 'f776be774a'.
-remote: Repository path is /home/site/repository
-remote: Running oryx build...
-remote: Operation performed by Microsoft Oryx, https://github.com/Microsoft/Oryx
-remote: You can report issues at https://github.com/Microsoft/Oryx/issues
-remote:
-remote: Oryx Version: 0.2.20210420.1, Commit: 85c6e9278aae3980b86cb1d520aaad532c814ed7, ReleaseTagName: 20210420.1
-remote:
-remote: Build Operation ID: |qwejn9R4StI=.5e8a3529_
-remote: Repository Commit : f776be774a3ea8abc48e5ee2b5132c037a636f73
-.
-.
-.
-remote: Deployment successful.
-remote: Deployment Logs : 'https://&lt;app-name&gt;.scm.azurewebsites.net/newui/jsonviewer?view_url=/api/deployments/a6fcf811136739f145e0de3be82ff195bca7a68b/log'
-To https://&lt;app-name&gt;.scm.azurewebsites.net/&lt;app-name&gt;.git
- 4f7e3ac..a6fcf81 main -> main
-</pre>
-
-> [!TIP]
-> During Git deployment, the deployment engine runs `npm install` as part of its build automation.
->
-> - As defined in `package.json`, the `postinstall` script is picked up by `npm install` and runs `ng build` to generate the production files for Angular and deploy them to the [assets](https://sailsjs.com/documentation/concepts/assets) folder.
-> - `scripts` in `package.json` can use tools that are installed in `node_modules/.bin`. Since `npm install` has installed `node_modules/.bin/ng` too, you can use it to deploy your Angular client files. This npm behavior is exactly the same in Azure App Service.
-> When build automation is complete, the whole completed repository is copied into the `/home/site/wwwroot` folder, out of which your app is hosted.
->
-> If your app needs to bypass the default automation and run custom automation, see [Run Grunt/Bower/Gulp](configure-language-nodejs.md#run-gruntbowergulp).
--
-### Browse to the Azure app
-
-Browse to the deployed app using your web browser.
-
-```bash
-https://<app-name>.azurewebsites.net
-```
-
-If you can create and see todo items in the browser, then your sample app in Azure has connectivity to the MongoDB (Cosmos DB) database.
-
-![MEAN app running in Azure App Service](./media/tutorial-nodejs-mongodb-app/run-in-azure.png)
-
-**Congratulations!** You're running a data-driven Node.js app in Azure App Service.
-
-## Update data model and redeploy
-
-In this step, you change the `Todo` data model and publish your change to Azure.
-
-### Update the server-side model
-
-In Sails.js, changing the server-side model and API code is as simple as changing the data model, because [Sails.js already defines the common routes](https://sailsjs.com/documentation/concepts/blueprints/blueprint-routes#?restful-routes) for a model by default.
-
-In your local repository, open _api/models/Todo.js_ and add a `done` attribute. When you're done, your schema code should look like this:
-
-```javascript
-module.exports = {
-
- attributes: {
- value: {type: 'string'},
- done: {type: 'boolean', defaultsTo: false}
- },
-
-};
-```
+### [Deploy using a ZIP file](#tab/azure-cli-deploy)
-### Update the client code
-There are three files you need to modify: the client model, the HTML template, and the component file.
+
-Open _client/src/app/todo.ts_ and add a `done` property. When you're done, your model show look like this:
+## 5 - Browse to the application
-```typescript
-export class Todo {
- id!: String;
- value!: String;
- done!: Boolean;
-}
-```
+The application has a URL of the form `https://<app-name>.azurewebsites.net`. Browse to this URL to view the application.
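As a quick sketch (the helper function and the sample app name below are illustrative, not part of the tutorial), the default hostname follows directly from the app name:

```javascript
// Hypothetical helper: build the default App Service URL for an app name.
// App Service assigns every app a hostname of the form <app-name>.azurewebsites.net.
function defaultAppUrl(appName) {
  return `https://${appName}.azurewebsites.net`;
}

console.log(defaultAppUrl('msdocs-expressjs-mongodb-123'));
// https://msdocs-expressjs-mongodb-123.azurewebsites.net
```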
-Open _client/src/app/app.component.html_. Just above the only `<span>` element, add the following code to add a checkbox at the beginning of each todo item:
+Use the form elements in the application to add and complete tasks.
-```html
-<input class="form-check-input me-2" type="checkbox" [checked]="todo.done" (click)="toggleDone(todo.id, i)" [disabled]="isProcessing">
-```
+![A screenshot showing the application running in a browser.](./media/tutorial-nodejs-mongodb-app/sample-app-in-browser.png)
-Open _client/src/app/app.component.ts_. Just above the last closing curly brace (`}`), insert the following method. It's called by the template code above when the checkbox is clicked and updates the server-side data.
-
-```typescript
-toggleDone(id:any, i:any) {
- console.log("Toggled checkbox for " + id);
- this.isProcessing = true;
- this.Todos[i].done = !this.Todos[i].done;
- this.restService.updateTodo(id, this.Todos[i])
- .subscribe((res) => {
- console.log('Data updated successfully!');
- this.isProcessing = false;
- }, (err) => {
- console.log(err);
- this.Todos[i].done = !this.Todos[i].done;
- });
-}
-```
+## 6 - Configure and view application logs
-### Test your changes locally
+Azure App Service captures all messages logged to the console to assist you in diagnosing issues with your application. The sample app outputs console log messages in each of its endpoints to demonstrate this capability. For example, the `get` endpoint outputs a message about the number of tasks retrieved from the database and an error message if something goes wrong.
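The tutorial's exact endpoint code isn't reproduced here, but the logging pattern it describes can be sketched as follows. This is a minimal sketch with a stubbed data layer; `getTasks` and `fakeDb` are hypothetical names, not the sample app's actual code:

```javascript
// Minimal sketch of the logging pattern described above (hypothetical names).
// On success, log how many tasks were retrieved; on failure, log the error.
// In App Service, anything written to the console appears in the diagnostic logs.
async function getTasks(db) {
  try {
    const tasks = await db.findAll();
    console.log(`Total tasks retrieved from the database: ${tasks.length}`);
    return tasks;
  } catch (err) {
    console.error('Error retrieving tasks:', err.message);
    throw err;
  }
}

// Stubbed data layer for local experimentation:
const fakeDb = { findAll: async () => [{ name: 'sample task', done: false }] };
getTasks(fakeDb).then((tasks) => console.log(`Returned ${tasks.length} task(s)`));
```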
-In the local terminal window, compile the updated Angular client code with the build script defined in `package.json`.
-```bash
-npm run build
-```
-
-Test your changes with `node app.js --alter` again. Since you changed your server-side model, the `--alter` flag lets `Sails.js` alter the data structure in your Cosmos DB database.
-
-```bash
-node app.js --alter
-```
+You can review the contents of the App Service diagnostic logs in the Azure portal, in VS Code, or by using the Azure CLI.
-Navigate to `http://localhost:1337`. You should now see a checkbox in front of todo item. When you select or clear a checkbox, the Cosmos DB database in Azure is updated to indicate that the todo item is done.
+### [Azure portal](#tab/azure-portal)
-![Added Done data and UI](./media/tutorial-nodejs-mongodb-app/added-done.png)
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Stream logs from Azure portal 1](<./includes/tutorial-nodejs-mongodb-app/stream-logs-azure-portal-1.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/stream-logs-azure-portal-1-240px.png" alt-text="A screenshot showing how to navigate to the log stream page for a web app in the Azure portal." lightbox="./media/tutorial-nodejs-mongodb-app/stream-logs-azure-portal-1.png"::: |
+| [!INCLUDE [Stream logs from Azure portal 2](<./includes/tutorial-nodejs-mongodb-app/stream-logs-azure-portal-2.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/stream-logs-azure-portal-2-240px.png" alt-text="A screenshot showing the log stream output for a web app in the Azure portal." lightbox="./media/tutorial-nodejs-mongodb-app/stream-logs-azure-portal-2.png"::: |
-In the terminal, stop Node.js by typing `Ctrl+C`.
+### [VS Code](#tab/vscode-aztools)
-### Publish changes to Azure
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Stream logs from VS Code 1](<./includes/tutorial-nodejs-mongodb-app/stream-logs-visual-studio-code-1.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/stream-logs-visual-studio-code-1-240px.png" alt-text="A screenshot showing the location of the Azure Tool icon in Visual Studio Code." lightbox="./media/tutorial-nodejs-mongodb-app/stream-logs-visual-studio-code-1.png"::: |
+| [!INCLUDE [Stream logs from VS Code 2](<./includes/tutorial-nodejs-mongodb-app/stream-logs-visual-studio-code-2.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/stream-logs-visual-studio-code-2-240px.png" alt-text="A screenshot showing how to start streaming logs by right-clicking on a web app in VS Code." lightbox="./media/tutorial-nodejs-mongodb-app/stream-logs-visual-studio-code-2.png"::: |
-In the local terminal window, commit your changes in Git, then push the code changes to Azure.
+### [Azure CLI](#tab/azure-cli)
-```bash
-git commit -am "added done field"
-git push azure main
-```
-Once the `git push` is complete, navigate to your Azure app and try out the new functionality.
+
-![Model and database changes published to Azure](media/tutorial-nodejs-mongodb-app/added-done-published.png)
+## 7 - Inspect deployed files using Kudu
-If you added any articles earlier, you still can see them. Existing data in your Cosmos DB is not lost. Also, your updates to the data schema and leaves your existing data intact.
+Azure App Service provides a web-based diagnostics console named [Kudu](/azure/app-service/resources-kudu) that allows you to examine the server hosting environment for your web app. Using Kudu, you can view the files deployed to Azure, review the deployment history of the application, and even open an SSH session into the hosting environment.
-## Stream diagnostic logs
+To access Kudu, navigate to one of the following URLs. You need to sign in to the Kudu site with your Azure credentials.
+* For apps deployed in Free, Shared, Basic, Standard, and Premium App Service plans - `https://<app-name>.scm.azurewebsites.net`
+* For apps deployed in Isolated service plans - `https://<app-name>.scm.<ase-name>.p.azurewebsites.net`
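The two URL patterns above can be captured in a small helper. This is a hypothetical sketch; `kuduUrl` is not part of App Service or the tutorial:

```javascript
// Hypothetical helper: derive the Kudu (SCM) URL for an App Service app.
// Pass aseName only for apps in an Isolated plan (App Service Environment).
function kuduUrl(appName, aseName) {
  return aseName
    ? `https://${appName}.scm.${aseName}.p.azurewebsites.net`
    : `https://${appName}.scm.azurewebsites.net`;
}

console.log(kuduUrl('contoso-app'));                // Free/Shared/Basic/Standard/Premium plans
console.log(kuduUrl('contoso-app', 'contoso-ase')); // Isolated plans
```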
-While your Node.js application runs in Azure App Service, you can get the console logs piped to your terminal. That way, you can get the same diagnostic messages to help you debug application errors.
+From the main page in Kudu, you can view information about the application hosting environment, app settings, and deployments, and browse the files in the wwwroot directory.
-To start log streaming, use the [`az webapp log tail`](/cli/azure/webapp/log#az_webapp_log_tail) command in the Cloud Shell.
+![A screenshot of the main page in the Kudu SCM app showing the different information available about the hosting environment.](./media/tutorial-nodejs-mongodb-app/kudu-main-page.png)
-```azurecli-interactive
-az webapp log tail --name <app-name> --resource-group myResourceGroup
-```
+Selecting the *Deployments* link under the REST API heading shows the deployment history of your web app.
-Once log streaming has started, refresh your Azure app in the browser to get some web traffic. You now see console logs piped to your terminal.
+![A screenshot of the deployments JSON in the Kudu SCM app showing the history of deployments to this web app.](./media/tutorial-nodejs-mongodb-app/kudu-deployments-list.png)
-Stop log streaming at any time by typing `Ctrl+C`.
+Selecting the *Site wwwroot* link under the Browse Directory heading allows you to browse and view the files on the web server.
+![A screenshot of files in the wwwroot directory showing how Kudu allows you to see what has been deployed to Azure.](./media/tutorial-nodejs-mongodb-app/kudu-wwwroot-files.png)
+## Clean up resources
+When you are finished, you can delete all of the resources from Azure by deleting the resource group for the application.
+### [Azure portal](#tab/azure-portal)
-## Manage your Azure app
+Follow these steps while signed in to the Azure portal to delete a resource group.
-Go to the [Azure portal](https://portal.azure.com) to see the app you created.
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Remove resource group Azure portal 1](<./includes/tutorial-nodejs-mongodb-app/remove-resource-group-azure-portal-1.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/remove-resource-group-azure-portal-1-240px.png" alt-text="A screenshot showing how to search for and navigate to a resource group in the Azure portal." lightbox="./media/tutorial-nodejs-mongodb-app/remove-resource-group-azure-portal-1.png"::: |
+| [!INCLUDE [Remove resource group Azure portal 2](<./includes/tutorial-nodejs-mongodb-app/remove-resource-group-azure-portal-2.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/remove-resource-group-azure-portal-2-240px.png" alt-text="A screenshot showing the location of the Delete Resource Group button in the Azure portal." lightbox="./media/tutorial-nodejs-mongodb-app/remove-resource-group-azure-portal-2.png"::: |
+| [!INCLUDE [Remove resource group Azure portal 3](<./includes/tutorial-nodejs-mongodb-app/remove-resource-group-azure-portal-3.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/remove-resource-group-azure-portal-3-240px.png" alt-text="A screenshot of the confirmation dialog for deleting a resource group in the Azure portal." lightbox="./media/tutorial-nodejs-mongodb-app/remove-resource-group-azure-portal-3.png"::: |
-From the left menu, click **App Services**, then click the name of your Azure app.
+### [VS Code](#tab/vscode-aztools)
-![Portal navigation to Azure app](./media/tutorial-nodejs-mongodb-app/access-portal.png)
+| Instructions | Screenshot |
+|:-|--:|
+| [!INCLUDE [Remove resource group VS Code 1](<./includes/tutorial-nodejs-mongodb-app/remove-resource-group-visual-studio-code-1.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/remove-resource-group-visual-studio-code-1-240px.png" alt-text="A screenshot showing how to delete a resource group in VS Code using the Azure Tools extension." lightbox="./media/tutorial-nodejs-mongodb-app/remove-resource-group-visual-studio-code-1.png"::: |
+| [!INCLUDE [Remove resource group VS Code 2](<./includes/tutorial-nodejs-mongodb-app/remove-resource-group-visual-studio-code-2.md>)] | :::image type="content" source="./media/tutorial-nodejs-mongodb-app/remove-resource-group-visual-studio-code-2-240px.png" alt-text="A screenshot of the confirmation dialog for deleting a resource group from VS Code." lightbox="./media/tutorial-nodejs-mongodb-app/remove-resource-group-visual-studio-code-2.png"::: |
-By default, the portal shows your app's **Overview** page. This page gives you a view of how your app is doing. Here, you can also perform basic management tasks like browse, stop, start, restart, and delete. The tabs on the left side of the page show the different configuration pages you can open.
+### [Azure CLI](#tab/azure-cli)
-![App Service page in Azure portal](./media/tutorial-nodejs-mongodb-app/web-app-blade.png)
+
-<a name="next"></a>
## Next steps
-What you learned:
-
-> [!div class="checklist"]
-> * Create a MongoDB database in Azure
-> * Connect a Node.js app to MongoDB
-> * Deploy the app to Azure
-> * Update the data model and redeploy the app
-> * Stream logs from Azure to your terminal
-> * Manage the app in the Azure portal
-
-Advance to the next tutorial to learn how to map a custom DNS name to the app.
-
-> [!div class="nextstepaction"]
-> [Map an existing custom DNS name to Azure App Service](app-service-web-tutorial-custom-domain.md)
-
-Or, check out other resources:
+> [!div class="nextstepaction"]
+> [JavaScript on Azure developer center](/azure/developer/javascript)
-- [Configure Node.js app](configure-language-nodejs.md)-- [Environment variables and app settings reference](reference-app-settings.md)
+> [!div class="nextstepaction"]
+> [Configure Node.js app in App Service](/azure/app-service/configure-language-nodejs)
automation Source Control Integration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/automation/source-control-integration.md
Source control integration lets you easily collaborate with your team, track changes, and roll back to earlier versions of your runbooks. For example, source control allows you to synchronize different branches in source control with your development, test, and production Automation accounts.

> [!NOTE]
-> Source control synchronization jobs are run under the user's Automation account and are billed at the same rate as other Automation jobs.
+> Source control synchronization jobs are run under the user's Automation account and are billed at the same rate as other Automation jobs. Additionally, Azure Automation Jobs do not support MFA (Multi-Factor Authentication).
## Source control types
automation Hybrid Runbook Worker https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/automation/troubleshoot/hybrid-runbook-worker.md
The following issues are possible causes:
* There's a mistyped workspace ID or workspace key (primary) in the agent's settings.
* The Hybrid Runbook Worker can't download the configuration, which causes an account linking error. When Azure enables features on machines, it supports only certain regions for linking a Log Analytics workspace and an Automation account. It's also possible that an incorrect date or time is set on the computer. If the time is +/- 15 minutes from the current time, feature deployment fails.
+* Log Analytics Gateway is not configured to support Hybrid Runbook Worker.
#### Resolution
To verify if the agent's workspace ID or workspace key was mistyped, see [Adding
##### Configuration not downloaded
-Your Log Analytics workspace and Automation account must be in a linked region. For a list of supported regions, see [Azure Automation and Log Analytics workspace mappings](../how-to/region-mappings.md).
+Your Log Analytics workspace and Automation account must be in a linked region. This is the suggested resolution for the system Hybrid Runbook Worker used by Update Management. For a list of supported regions, see [Azure Automation and Log Analytics workspace mappings](../how-to/region-mappings.md).
You might also need to update the date or time zone of your computer. If you select a custom time range, make sure that the range is in UTC, which can differ from your local time zone.
+##### Log Analytics gateway not configured
+
+To add Hybrid Runbook Worker endpoints to the Log Analytics gateway, follow the steps in [Configure the gateway for Automation Hybrid Runbook Workers](/azure/azure-monitor/agents/gateway#configure-for-automation-hybrid-runbook-workers).
++

### <a name="set-azstorageblobcontent-execution-fails"></a>Scenario: Set-AzStorageBlobContent fails on a Hybrid Runbook Worker

#### Issue
azure-arc Azure Data Studio Dashboards https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/data/azure-data-studio-dashboards.md
[Azure Data Studio](/sql/azure-data-studio/what-is) provides an experience similar to the Azure portal for viewing information about your Azure Arc resources. These views are called **dashboards** and have a layout and options similar to what you could see about a given resource in the Azure portal, but give you the flexibility of seeing that information locally in your environment in cases where you don't have a connection available to Azure.
-## Connecting to a data controller
+## Connect to a data controller
### Prerequisites

- Download [Azure Data Studio](/sql/azure-data-studio/download-azure-data-studio)
- Azure Arc extension is installed
+### Connect
+
+1. Open Azure Data Studio.
+2. Select the **Connections** tab on the left.
+3. Expand the panel called **Azure Arc Controllers**.
+4. Select the **Connect Controller** button.
+ Azure Data Studio opens a blade on the right side.
-### Connect
+5. Enter the **Namespace** for the data controller.
-1. Open Azure Data Studio
-2. Select the **Connections** tab on the left
-3. Expand the panel called **Azure Arc Controllers**
-4. Click the **Connect Controller** button. This will open a blade on the right side
-5. By default, Azure Data Studio will try to read from the kube.config file in your default directory and list the available kubernetes cluster contexts and pre-select the current cluster context. If this is the right cluster to connect to, enter the namespace where the Azure Arc data controller is deployed in the input for **Namespace**. If you need to retrieve the namespace where the Azure Arc data controller is deployed, you can run ```kubectl get datacontrollers -A``` on your kubernetes cluster.
-6. Optionally add a display name for the Azure Arc data controller in the input for **Name**
-7. Select **Connect**
+ Azure Data Studio reads from the `kube.config` file in your default directory and lists the available Kubernetes cluster contexts. It selects the current cluster context. If this is the right cluster to connect to, use that namespace.
+ If you need to retrieve the namespace where the Azure Arc data controller is deployed, you can run `kubectl get datacontrollers -A` on your Kubernetes cluster.
-Now that you are connected to a data controller, you can view the dashboards for the data controller and any SQL managed instances or PostgreSQL Hyperscale server group resources that you have.
+6. Optionally add a display name for the Azure Arc data controller in the input for **Name**.
+7. Select **Connect**.
-## View the Data Controller dashboard
+
+After you connect to a data controller, you can view the dashboards. Azure Data Studio has dashboards for the data controller and any SQL managed instances or PostgreSQL Hyperscale server group resources that you have.
+
+## View the data controller dashboard
Right-click on the data controller in the Connections panel in the **Arc Controllers** expandable panel and choose **Manage**.
Conveniently, you can launch the creation of a SQL managed instance or PostgreSQL Hyperscale server group from here.
You can also open the Azure portal in context to this data controller by clicking the Open in Azure portal button.
-## View the SQL managed instance dashboards
+## View the SQL Managed Instance dashboards
-If you have created some SQL managed instances, you can see them listed in the Connections panel in the Azure Data Controllers expandable panel underneath the data controller that is managing them.
+If you have created SQL Managed Instances, they are listed under **Connections** in the **Azure Data Controllers** expandable panel underneath the data controller that manages them.
-To view the SQL managed instance dashboard for a given instance, right-click on the instance and choose Manage.
+To view the SQL Managed Instance dashboard for a given instance, right-click on the instance and choose **Manage**.
-The Connection panel will pop up on the right and prompt you for the login/password to connect to that SQL instance. If you know the connection information you can enter it and click Connect. If you don't know, you can click Cancel. Either way, you will be brought to the dashboard when the Connection panel closes.
+The **Connection** panel prompts you for the login and password to connect to an instance. If you know the connection information you can enter it and choose **Connect**. If you don't know, choose **Cancel**. Either way, Azure Data Studio returns to the dashboard when the **Connection** panel closes.
-On the Overview tab you can view details about the SQL managed instance such as resource group, data controller, subscription ID, status, region and more. You can also see link that you can click to go into the Grafana or Kibana dashboards in context to that SQL managed instance.
+On the **Overview** tab, view resource group, data controller, subscription ID, status, region, and other information. This location also provides links to the Grafana dashboard for viewing metrics or Kibana dashboard for viewing logs in context to that SQL managed instance.
-If you are able to connect to the SQL manage instance, you can see additional information here.
+With a connection to the SQL managed instance, you can see additional information here.
You can delete the SQL managed instance from here or open the Azure portal to view the SQL managed instance in the Azure portal.
-If you click on the Connection Strings tab on the left, you can see a list of pre-constructed connection strings for that SQL managed instance making it easy for you to copy/paste into various other applications or code.
+If you select the **Connection Strings** tab, Azure Data Studio presents a list of pre-constructed connection strings for that instance. Copy and paste these strings into other applications or code.
## View the PostgreSQL Hyperscale server group dashboards
-If you have created some PostgreSQL Hyperscale server groups, you can see them listed in the Connections panel in the Azure Data Controllers expandable panel underneath the data controller that is managing them.
+If the deployment includes PostgreSQL Hyperscale server groups, Azure Data Studio lists them in the **Connections** panel in the **Azure Data Controllers** expandable panel underneath the data controller that is managing them.
To view the PostgreSQL Hyperscale server group dashboard for a given server group, right-click on the server group and choose Manage.
-On the Overview tab you can view details about the server group such as resource group, data controller, subscription ID, status, region and more. You can also see link that you can click to go into the Grafana or Kibana dashboards in context to that server group.
+On the **Overview** tab, review details about the server group such as resource group, data controller, subscription ID, status, region and more. The tab also has links to the Grafana dashboard for viewing metrics or Kibana dashboard for viewing logs in context to that server group.
You can delete the server group from here or open the Azure portal to view the server group in the Azure portal.
-If you click on the Connection Strings tab on the left, you can see a list of pre-constructed connection strings for that server group making it easy for you to copy/paste into various other applications or code.
+If you click on the **Connection Strings** tab on the left, Azure Data Studio provides pre-constructed connection strings for that server group. Copy and paste these strings into other applications or code.
+
+Select the **Properties** tab on the left to see additional details.
+
+The **Resource health** tab on the left displays the current health of that server group.
-If you click on the Properties tab on the left, you can see additional details.
+The **Diagnose and solve problems** tab on the left launches the PostgreSQL troubleshooting notebook.
-If you click on the Resource health tab on the left you can see the current high-level health of that server group.
+For Azure support, select the **New support request** tab. This launches the Azure portal in the context of the server group. Create an Azure support request from there.
-If you click on the Diagnose and solve problems tab on the left, you can launch the PostgreSQL troubleshooting notebook.
+## Next steps
-If you click on the New support request tab on the left, you can launch the Azure portal in context to the server group and create an Azure support request from there.
+- [View SQL Managed Instance in the Azure portal](view-arc-data-services-inventory-in-azure-portal.md)
azure-arc Create Complete Managed Instance Directly Connected https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/data/create-complete-managed-instance-directly-connected.md
The next step is to create the data controller in directly connected mode via th
:::image type="content" source="media/create-complete-managed-instance-directly-connected/custom-location.png" alt-text="Create a new custom location and specify a namespace."::: 1. For **Kubernetes configuration template**, specify *azure-arc-aks-premium-storage* because this example uses an AKS cluster.
-1. Set a user name and password for the metrics and log services.
+2. For **Service type**, select **Load balancer**.
+3. Set a user name and password for the metrics and log services.
The passwords must be at least eight characters long and contain characters from three of the following four categories: Latin uppercase letters, Latin lowercase letters, numbers, and non-alphanumeric characters.
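The "three of the following four categories" rule above can be sketched as a quick local check. This is illustrative only; the service performs its own validation:

```python
import string

def meets_password_policy(password: str) -> bool:
    """Check the stated rule: at least eight characters, drawn from at
    least three of four categories (Latin uppercase, Latin lowercase,
    digits, non-alphanumeric characters)."""
    if len(password) < 8:
        return False
    categories = [
        any(c in string.ascii_uppercase for c in password),
        any(c in string.ascii_lowercase for c in password),
        any(c.isdigit() for c in password),
        any(not c.isalnum() for c in password),
    ]
    return sum(categories) >= 3

print(meets_password_policy("P@ssw0rd"))   # True: four categories, length 8
print(meets_password_policy("password"))   # False: one category only
```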
azure-arc Delete Managed Instance https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/data/delete-managed-instance.md
Name Replicas ServerEndpoint State
demo-mi 1/1 10.240.0.4:32023 Ready ```
-## Delete a Azure Arc-enabled SQL Managed Instance
+## Delete Azure Arc-enabled SQL Managed Instance
+ To delete a SQL Managed Instance, run the following command: ```azurecli
Deleted demo-mi from namespace arc
## Reclaim the Kubernetes Persistent Volume Claims (PVCs)
-Deleting a SQL Managed Instance does not remove its associated [PVCs](https://kubernetes.io/docs/concepts/storage/persistent-volumes/). This is by design. The intention is to help the user to access the database files in case the deletion of instance was accidental. Deleting PVCs is not mandatory. However it is recommended. If you don't reclaim these PVCs, you'll eventually end up with errors as your Kubernetes cluster will out of disk space. To reclaim the PVCs, take the following steps:
+A PersistentVolumeClaim (PVC) is a request for storage from the Kubernetes cluster, created when storage is added to a SQL Managed Instance. Deleting a SQL Managed Instance does not remove its associated [PVCs](https://kubernetes.io/docs/concepts/storage/persistent-volumes/). This is by design. The intention is to help the user access the database files in case the deletion of the instance was accidental. Deleting PVCs is not mandatory, but it is recommended. If you don't reclaim these PVCs, you'll eventually end up with errors as your Kubernetes cluster runs out of disk space, and reusing the same SQL Managed Instance name when creating a new instance might cause inconsistencies. To reclaim the PVCs, take the following steps:
### 1. List the PVCs for the server group you deleted+ To list the PVCs, run the following command: ```console kubectl get pvc ```
-In the follow example below, notice the PVCs for the SQL Managed Instances you deleted.
+In the example below, notice the PVCs for the SQL Managed Instances you deleted.
+ ```console # kubectl get pvc -n arc
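The reclaim steps can be sketched end to end as follows. The PVC names and the `arc` namespace here are illustrative; substitute the names returned by `kubectl get pvc` for your deleted instance:

```console
# List PVCs that survived the instance deletion
kubectl get pvc -n arc

# Delete each PVC that belonged to the deleted instance
kubectl delete pvc data-demo-mi-0 -n arc
kubectl delete pvc logs-demo-mi-0 -n arc
```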
azure-arc Delete Postgresql Hyperscale Server Group https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/data/delete-postgresql-hyperscale-server-group.md
az postgres arc-server delete -n postgres01 --k8s-namespace <namespace> --use-k8
## Reclaim the Kubernetes Persistent Volume Claims (PVCs)
-Deleting a server group does not remove its associated [PVCs](https://kubernetes.io/docs/concepts/storage/persistent-volumes/). This is by design. The intention is to help the user to access the database files in case the deletion of instance was accidental. Deleting PVCs is not mandatory. However it is recommended. If you don't reclaim these PVCs, you'll eventually end up with errors as your Kubernetes cluster will think it's running out of disk space.
+A PersistentVolumeClaim (PVC) is a request for storage from the Kubernetes cluster, created when storage is added to a PostgreSQL Hyperscale server group. Deleting a server group does not remove its associated [PVCs](https://kubernetes.io/docs/concepts/storage/persistent-volumes/). This is by design. The intention is to help the user access the database files in case the deletion of the instance was accidental. Deleting PVCs is not mandatory, but it is recommended. If you don't reclaim these PVCs, you'll eventually end up with errors as your Kubernetes cluster thinks it's running out of disk space, and reusing the same PostgreSQL Hyperscale server group name when creating a new one might cause inconsistencies.
To reclaim the PVCs, take the following steps: ### 1. List the PVCs for the server group you deleted
azure-arc Monitor Grafana Kibana https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/data/monitor-grafana-kibana.md
# View logs and metrics using Kibana and Grafana
-Kibana and Grafana web dashboards are provided to bring insight and clarity to the Kubernetes namespaces being used by Azure Arc-enabled data services.
+Kibana and Grafana web dashboards are provided to bring insight and clarity to the Kubernetes namespaces being used by Azure Arc-enabled data services. To find the service endpoints for the Kibana and Grafana web dashboards, see the [Azure Data Studio dashboards](/azure/azure-arc/data/azure-data-studio-dashboards) documentation.
+
azure-arc Upgrade Sql Managed Instance Cli https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/data/upgrade-sql-managed-instance-cli.md
Title: Upgrade an an indirectly connected Azure Arc-enabled Managed Instance using the CLI
+ Title: Upgrade an indirectly connected Azure Arc-enabled Managed Instance using the CLI
description: Article describes how to upgrade an indirectly connected Azure Arc-enabled Managed Instance using the CLI
Preparing to upgrade sql sqlmi-1 in namespace arc to data controller version.
### General Purpose
-During a SQL Managed Instance General Purpose upgrade, the containers in the pod will be upgraded and will be reprovisioned. This will cause a short amount of downtime as the new pod is created. You will need to build resiliency into your application, such as connection retry logic, to ensure minimal disruption. Read [Overview of the reliability pillar](/azure/architecture/framework/resiliency/overview) for more information on architecting resiliency.
+During a SQL Managed Instance General Purpose upgrade, the containers in the pod will be upgraded and will be reprovisioned. This will cause a short amount of downtime as the new pod is created. You will need to build resiliency into your application, such as connection retry logic, to ensure minimal disruption. For more information on architecting resiliency, read [Overview of the reliability pillar](/azure/architecture/framework/resiliency/overview) and [Retry guidance for Azure services](/azure/architecture/best-practices/retry-service-specific#sql-database-using-adonet).
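Connection retry logic of the kind described above can be sketched generically as retry with exponential backoff. The error type and delays below are placeholders; a real SQL client surfaces driver-specific transient errors:

```python
import time

def with_retries(operation, attempts=5, base_delay=0.1):
    """Retry a callable with exponential backoff, re-raising after the
    final attempt. `operation` stands in for opening a SQL connection."""
    for attempt in range(attempts):
        try:
            return operation()
        except ConnectionError:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Simulate a connection that fails twice while the pod is reprovisioned.
state = {"calls": 0}
def flaky_connect():
    state["calls"] += 1
    if state["calls"] < 3:
        raise ConnectionError("instance pod is being reprovisioned")
    return "connected"

print(with_retries(flaky_connect, base_delay=0.05))  # → connected
```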
To upgrade the Managed Instance, use the following command:
azure-arc Upgrade Sql Managed Instance Direct Cli https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/data/upgrade-sql-managed-instance-direct-cli.md
Preparing to upgrade sql sqlmi-1 in namespace arc to data controller version.
### General Purpose
-During a SQL Managed Instance General Purpose upgrade, the containers in the pod will be upgraded and will be reprovisioned. This will cause a short amount of downtime as the new pod is created. You will need to build resiliency into your application, such as connection retry logic, to ensure minimal disruption. Read [Overview of the reliability pillar](/azure/architecture/framework/resiliency/overview) for more information on architecting resiliency.
+During a SQL Managed Instance General Purpose upgrade, the containers in the pod will be upgraded and will be reprovisioned. This will cause a short amount of downtime as the new pod is created. You will need to build resiliency into your application, such as connection retry logic, to ensure minimal disruption. For more information on architecting resiliency, read [Overview of the reliability pillar](/azure/architecture/framework/resiliency/overview) and [Retry guidance for Azure services](/azure/architecture/best-practices/retry-service-specific#sql-database-using-adonet).
To upgrade the Managed Instance, use the following command:
azure-functions Create First Function Vs Code Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/create-first-function-vs-code-python.md
Before you get started, make sure you have the following requirements in place:
+ The [Azure Functions Core Tools](functions-run-local.md#install-the-azure-functions-core-tools) version 3.x.
-+ [Python versions that are supported by Azure Functions](supported-languages.md#languages-by-runtime-version)
++ [Python versions that are supported by Azure Functions](supported-languages.md#languages-by-runtime-version). For more information, see [How to install Python](https://wiki.python.org/moin/BeginnersGuide/Download). + [Visual Studio Code](https://code.visualstudio.com/) on one of the [supported platforms](https://code.visualstudio.com/docs/supporting/requirements#_platforms).
azure-functions Functions How To Azure Devops https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-how-to-azure-devops.md
You can use the following sample to create a YAML file to build a .NET app:
```yaml pool:
- vmImage: 'windows-2019'
+ vmImage: 'windows-latest'
steps: - script: | dotnet restore
You can use the following sample to create a YAML file to build a JavaScript app
```yaml pool:
- vmImage: ubuntu-latest # Use 'windows-2019' if you have Windows native +Node modules
+ vmImage: ubuntu-latest # Use 'windows-latest' if you have Windows native +Node modules
steps: - bash: | if [ -f extensions.csproj ]
You can use the following sample to create a YAML file to package a PowerShell a
```yaml pool:
- vmImage: 'windows-2019'
+ vmImage: 'windows-latest'
steps: - task: ArchiveFiles@2 displayName: "Archive files"
azure-functions Functions Identity Access Azure Sql With Managed Identity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-identity-access-azure-sql-with-managed-identity.md
# Tutorial: Connect a function app to Azure SQL with managed identity and SQL bindings
-Azure Functions provides a [managed identity](/azure/active-directory/managed-identities-azure-resources/overview.md), which is a turn-key solution for securing access to [Azure SQL Database](/azure/sql-database/) and other Azure services. Managed identities make your app more secure by eliminating secrets from your app, such as credentials in the connection strings. In this tutorial, you'll add managed identity to an Azure Function that utilizes [Azure SQL bindings](/azure/azure-functions/functions-bindings-azure-sql). A sample Azure Function project with SQL bindings is available in the [ToDo backend example](/samples/azure-samples/azure-sql-binding-func-dotnet-todo/todo-backend-dotnet-azure-sql-bindings-azure-functions/)).
+Azure Functions provides a [managed identity](/azure/active-directory/managed-identities-azure-resources/overview), which is a turn-key solution for securing access to [Azure SQL Database](/azure/sql-database/) and other Azure services. Managed identities make your app more secure by eliminating secrets from your app, such as credentials in connection strings. In this tutorial, you'll add managed identity to an Azure Function that utilizes [Azure SQL bindings](/azure/azure-functions/functions-bindings-azure-sql). A sample Azure Function project with SQL bindings is available in the [ToDo backend example](/samples/azure-samples/azure-sql-binding-func-dotnet-todo/todo-backend-dotnet-azure-sql-bindings-azure-functions/).
When you're finished with this tutorial, your Azure Function will connect to Azure SQL database without the need for a username and password.
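For illustration, a credential-free connection string of this kind takes roughly the following shape (server and database names are placeholders; the `Authentication=Active Directory Managed Identity` keyword is supported by recent Microsoft.Data.SqlClient versions):

```
Server=tcp:<your-server>.database.windows.net,1433;Database=<your-database>;Authentication=Active Directory Managed Identity;
```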
azure-functions Functions Reference Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-reference-python.md
In this function, the value of the `name` query parameter is obtained from the `
Likewise, you can set the `status_code` and `headers` for the response message in the returned [HttpResponse] object.
+## Web frameworks
+
+You can leverage WSGI and ASGI-compatible frameworks such as Flask and FastAPI with your HTTP-triggered Python functions. This section shows how to modify your functions to support these frameworks.
+
+First, the function.json file must be updated to include a `route` in the HTTP trigger, as shown in the following example:
+
+```json
+{
+ "scriptFile": "__init__.py",
+ "bindings": [
+ {
+ "route": "test",
+ "authLevel": "anonymous",
+ "type": "httpTrigger",
+ "direction": "in",
+ "name": "req",
+ "methods": [
+ "get",
+ "post"
+ ]
+ },
+ {
+ "type": "http",
+ "direction": "out",
+ "name": "$return"
+ }
+ ]
+}
+```
+
+The host.json file must also be updated to include an HTTP `routePrefix`, as shown in the following example.
+
+```json
+{
+ "version": "2.0",
+ "logging": {
+ "applicationInsights": {
+ "samplingSettings": {
+ "isEnabled": true,
+ "excludedTypes": "Request"
+ }
+ },
+ "extensions": { "http": { "routePrefix": "" }}
+ },
+ "extensionBundle": {
+ "id": "Microsoft.Azure.Functions.ExtensionBundle",
+ "version": "[2.*, 3.0.0)"
+ }
+}
+```
+
+Update the Python code file `__init__.py`, depending on the interface used by your framework. The following example shows an ASGI handler approach and a WSGI wrapper approach:
+
+# [ASGI](#tab/asgi)
+
+```python
+import logging
+
+import azure.functions as func
+# FastAPI (an ASGI framework) is used here; Flask is WSGI-only and does
+# not work with AsgiMiddleware.
+from fastapi import FastAPI
+
+app = FastAPI()
+
+@app.get("/api/HandleApproach")
+async def test():
+    return "Hello!"
+
+def main(req: func.HttpRequest, context) -> func.HttpResponse:
+ logging.info('Python HTTP trigger function processed a request.')
+ return func.AsgiMiddleware(app).handle(req, context)
+```
+
+# [WSGI](#tab/wsgi)
+
+```python
+import logging
+
+import azure.functions as func
+from flask import Flask
+
+app = Flask("Test")
+
+@app.route("/api/WrapperApproach")
+def test():
+ return "Hello!"
+
+def main(req: func.HttpRequest, context) -> func.HttpResponse:
+ logging.info('Python HTTP trigger function processed a request.')
+ return func.WsgiMiddleware(app).handle(req, context)
+```
++++ ## Scaling and Performance For scaling and performance best practices for Python function apps, see the [Python scale and performance article](python-scale-performance-reference.md).
An extension that inherits from [FuncExtensionBase](https://github.com/Azure/azu
CORS is fully supported for Python function apps.
+## Async
+
+By default, a host instance for Python can process only one function invocation at a time, because Python is a single-threaded runtime. For a function app that processes a large number of I/O events or is I/O bound, you can significantly improve performance by running functions asynchronously. For more information, see [Improve throughput performance of Python apps in Azure Functions](python-scale-performance-reference.md#async).
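The async pattern is sketched below with plain asyncio. In a function app you would declare `async def main(req)` instead; the sleep here stands in for an I/O-bound call:

```python
import asyncio

async def fetch(name: str) -> str:
    # Stand-in for an I/O-bound call (HTTP request, database query, ...).
    await asyncio.sleep(0.1)
    return f"{name}: done"

async def handle_many():
    # Overlapping the waits is what improves throughput for I/O-bound work:
    # three 0.1 s waits complete in roughly 0.1 s total, not 0.3 s.
    return await asyncio.gather(*(fetch(f"req{i}") for i in range(3)))

results = asyncio.run(handle_many())
print(results)  # ['req0: done', 'req1: done', 'req2: done']
```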
+ ## <a name="shared-memory"></a>Shared memory (preview) To improve throughput, Functions lets your out-of-process Python language worker share memory with the Functions host process. When your function app is hitting bottlenecks, you can enable shared memory by adding an application setting named [FUNCTIONS_WORKER_SHARED_MEMORY_DATA_TRANSFER_ENABLED](functions-app-settings.md#functions_worker_shared_memory_data_transfer_enabled) with a value of `1`. With shared memory enabled, you can then use the [DOCKER_SHM_SIZE](functions-app-settings.md#docker_shm_size) setting to set the shared memory to something like `268435456`, which is equivalent to 256 MB.
azure-monitor Alerts Log https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/alerts/alerts-log.md
You can also [create log alert rules using Azure Resource Manager templates](../
:::image type="content" source="media/alerts-log/alerts-rule-details-tab.png" alt-text="Details tab.":::
+> [!NOTE]
+> If you or your administrator assigned the Azure Policy **Azure Log Search Alerts over Log Analytics workspaces should use customer-managed keys**, you must select the **Check workspace linked storage** option in **Advanced options**, or the rule creation will fail because it won't meet the policy requirements.
+ 1. In the **Tags** tab, set any required tags on the alert rule resource. :::image type="content" source="media/alerts-log/alerts-rule-tags-tab.png" alt-text="Tags tab.":::
On success for creation, 201 is returned. On success for update, 200 is returned
* Learn about [log alerts](./alerts-unified-log.md). * Create log alerts using [Azure Resource Manager Templates](./alerts-log-create-templates.md). * Understand [webhook actions for log alerts](./alerts-log-webhook.md).
-* Learn more about [log queries](../logs/log-query-overview.md).
+* Learn more about [log queries](../logs/log-query-overview.md).
azure-monitor Profiler https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/profiler.md
Currently the only regions that require endpoint modifications are [Azure Govern
|ApplicationInsightsProfilerEndpoint | `https://profiler.monitor.azure.us` | `https://profiler.monitor.azure.cn` | |ApplicationInsightsEndpoint | `https://dc.applicationinsights.us` | `https://dc.applicationinsights.azure.cn` |
+## Enable Azure Active Directory authentication for profile ingestion
+
+Application Insights Profiler supports Azure AD authentication for profile ingestion. This means that, for all profiles of your application to be ingested, your application must be authenticated and provide the required application settings to the Profiler agent.
+
+Currently, Profiler only supports Azure AD authentication when you reference and configure Azure AD using the Application Insights SDK in your application.
+
+Below are the steps required to enable Azure AD for profile ingestion:
+1. Create and add the managed identity you want to use to authenticate against your Application Insights resource to your App Service.
+
+ a. For System-Assigned Managed identity, see the following [documentation](https://docs.microsoft.com/azure/app-service/overview-managed-identity?tabs=portal%2Chttp#add-a-system-assigned-identity)
+
+ b. For User-Assigned Managed identity, see the following [documentation](https://docs.microsoft.com/azure/app-service/overview-managed-identity?tabs=portal%2Chttp#add-a-user-assigned-identity)
+
+2. Configure and enable Azure AD in your Application Insights resource. For more information, see the following [documentation](https://docs.microsoft.com/azure/azure-monitor/app/azure-ad-authentication?tabs=net#configuring-and-enabling-azure-ad-based-authentication)
+3. Add the following application setting, used to let Profiler agent know which managed identity to use:
+
+For System-Assigned Identity:
+
+|App Setting | Value |
+||-|
+|APPLICATIONINSIGHTS_AUTHENTICATION_STRING | Authentication=AAD |
+
+For User-Assigned Identity:
+
+|App Setting | Value |
+||-|
+|APPLICATIONINSIGHTS_AUTHENTICATION_STRING | Authentication=AAD;ClientId={Client id of the User-Assigned Identity} |
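The setting value is a semicolon-delimited list of key/value pairs, which can be sketched with a minimal parser (the zero GUID below is a made-up placeholder for a real client ID):

```python
def parse_auth_string(value: str) -> dict:
    """Split an APPLICATIONINSIGHTS_AUTHENTICATION_STRING-style value
    into a dictionary of key/value pairs."""
    pairs = (part.split("=", 1) for part in value.split(";") if part)
    return {k.strip(): v.strip() for k, v in pairs}

system = parse_auth_string("Authentication=AAD")
user = parse_auth_string(
    "Authentication=AAD;ClientId=00000000-0000-0000-0000-000000000000")
print(system["Authentication"])  # AAD
print(user["ClientId"])          # 00000000-0000-0000-0000-000000000000
```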
+ ## Disable Profiler To stop or restart Profiler for an individual app's instance, on the left sidebar, select **WebJobs** and stop the webjob named `ApplicationInsightsProfiler3`.
azure-monitor Snapshot Debugger Appservice https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/snapshot-debugger-appservice.md
Currently the only regions that require endpoint modifications are [Azure Govern
For more information about other connection overrides, see [Application Insights documentation](./sdk-connection-string.md?tabs=net#connection-string-with-explicit-endpoint-overrides).
+## Enable Azure Active Directory authentication for snapshot ingestion
+
+Application Insights Snapshot Debugger supports Azure AD authentication for snapshot ingestion. This means that, for all snapshots of your application to be ingested, your application must be authenticated and provide the required application settings to the Snapshot Debugger agent.
+
+Currently, Snapshot Debugger only supports Azure AD authentication when you reference and configure Azure AD using the Application Insights SDK in your application.
+
+Below are the steps required to enable Azure AD for snapshot ingestion:
+1. Create and add the managed identity you want to use to authenticate against your Application Insights resource to your App Service.
+
+ a. For System-Assigned Managed identity, see the following [documentation](https://docs.microsoft.com/azure/app-service/overview-managed-identity?tabs=portal%2Chttp#add-a-system-assigned-identity)
+
+ b. For User-Assigned Managed identity, see the following [documentation](https://docs.microsoft.com/azure/app-service/overview-managed-identity?tabs=portal%2Chttp#add-a-user-assigned-identity)
+
+2. Configure and enable Azure AD in your Application Insights resource. For more information, see the following [documentation](https://docs.microsoft.com/azure/azure-monitor/app/azure-ad-authentication?tabs=net#configuring-and-enabling-azure-ad-based-authentication)
+3. Add the following application setting, used to let Snapshot Debugger agent know which managed identity to use:
+
+For System-Assigned Identity:
+
+|App Setting | Value |
+||-|
+|APPLICATIONINSIGHTS_AUTHENTICATION_STRING | Authentication=AAD |
+
+For User-Assigned Identity:
+
+|App Setting | Value |
+||-|
+|APPLICATIONINSIGHTS_AUTHENTICATION_STRING | Authentication=AAD;ClientId={Client id of the User-Assigned Identity} |
+ ## Disable Snapshot Debugger Follow the same steps as for **Enable Snapshot Debugger**, but switch both switches for Snapshot Debugger to **Off**.
azure-monitor Activity Log https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/essentials/activity-log.md
# Azure Activity log
-The Activity log is a [platform log](./platform-logs-overview.md) in Azure that provides insight into subscription-level events. This includes such information as when a resource is modified or when a virtual machine is started. You can view the Activity log in the Azure portal or retrieve entries with PowerShell and CLI. This article provides details on viewing the Activity log and sending it to different destinations.
+The Activity log is a [platform log](./platform-logs-overview.md) in Azure that provides insight into subscription-level events. The Activity log includes such information as when a resource is modified or when a virtual machine is started. You can view the Activity log in the Azure portal or retrieve entries with PowerShell and CLI. This article provides details on viewing the Activity log and sending it to different destinations.
-For additional functionality, you should create a diagnostic setting to send the Activity log to one or more of these locations for the following reasons:
+For more functionality, you should create a diagnostic setting to send the Activity log to one or more of these locations for the following reasons:
- to [Azure Monitor Logs](../logs/data-platform-logs.md) for more complex querying and alerting, and longer retention (up to 2 years) - to Azure Event Hubs to forward outside of Azure - to Azure Storage for cheaper, long-term archiving
See [Create diagnostic settings to send platform logs and metrics to different d
## Retention Period
-Activity log events are retained in Azure for **90 days** and then deleted. There is no charge for entries during this time regardless of volume. For additional functionality such as longer retention, you should create a diagnostic setting and route the entires to another location based on your needs. See the criteria in the earlier section of this article.
+Activity log events are retained in Azure for **90 days** and then deleted. There is no charge for entries during this time regardless of volume. For more functionality such as longer retention, you should create a diagnostic setting and route the entries to another location based on your needs. See the criteria in the earlier section of this article.
## View the Activity log
-You can access the Activity log from most menus in the Azure portal. The menu that you open it from determines its initial filter. If you open it from the **Monitor** menu, then the only filter will be on the subscription. If you open it from a resource's menu, then the filter will be set to that resource. You can always change the filter though to view all other entries. Click **Add Filter** to add additional properties to the filter.
+You can access the Activity log from most menus in the Azure portal. The menu that you open it from determines its initial filter. If you open it from the **Monitor** menu, then the only filter will be on the subscription. If you open it from a resource's menu, then the filter is set to that resource. You can always change the filter though to view all other entries. Select **Add Filter** to add more properties to the filter.
![View Activity Log](./media/activity-log/view-activity-log.png)
For some events, you can view the Change history, which shows what changes happe
![Change history list for an event](media/activity-log/change-history-event.png)
-If there are any associated changes with the event, you'll see a list of changes that you can select. This opens up the **Change history (Preview)** page. On this page, you see the changes to the resource. In the following example, you can see not only that the VM changed sizes, but what the previous VM size was before the change and what it was changed to. To learn more about change history, see [Get resource changes](../../governance/resource-graph/how-to/get-resource-changes.md).
+If there are any associated changes with the event, you will see a list of changes that you can select. This opens up the **Change history (Preview)** page. On this page, you see the changes to the resource. In the following example, you can see not only that the VM changed sizes, but what the previous VM size was before the change and what it was changed to. To learn more about change history, see [Get resource changes](../../governance/resource-graph/how-to/get-resource-changes.md).
![Change history page showing differences](media/activity-log/change-history-event-details.png) ### Other methods to retrieve Activity log events
-You can also access Activity log events using the following methods.
+You can also access Activity log events using the following methods:
- Use the [Get-AzLog](/powershell/module/az.monitor/get-azlog) cmdlet to retrieve the Activity Log from PowerShell. See [Azure Monitor PowerShell samples](../powershell-samples.md#retrieve-activity-log). - Use [az monitor activity-log](/cli/azure/monitor/activity-log) to retrieve the Activity Log from CLI. See [Azure Monitor CLI samples](../cli-samples.md#view-activity-log).
You can also access Activity log events using the following methods.
- No data ingestion charges for Activity log data stored in a Log Analytics workspace. - No data retention charges for the first 90 days for Activity log data stored in a Log Analytics workspace.
+ Select **Export Activity Logs**.
-[Create a diagnostic setting](./diagnostic-settings.md) to send the Activity log to a Log Analytics workspace. You can send the Activity log from any single subscription to up to five workspaces.
+ ![Export activity logs](media/activity-log/diagnostic-settings-export.png)
+
+Then [create a diagnostic setting](./diagnostic-settings.md) to send the Activity log to a Log Analytics workspace. You can send the Activity log from any single subscription to up to five workspaces.
Activity log data in a Log Analytics workspace is stored in a table called *AzureActivity* that you can retrieve with a [log query](../logs/log-query-overview.md) in [Log Analytics](../logs/log-analytics-tutorial.md). The structure of this table varies depending on the [category of the log entry](activity-log-schema.md). For a description of the table properties, see the [Azure Monitor data reference](/azure/azure-monitor/reference/tables/azureactivity).
-For example, to view a count of Activity log records for each category, use the following query.
+For example, to view a count of Activity log records for each category, use the following query:
```kusto AzureActivity | summarize count() by CategoryValue ```
-To retrieve all records in the administrative category, use the following query.
+To retrieve all records in the administrative category, use the following query:
```kusto AzureActivity
Following is sample output data from Event Hubs for an Activity log:
``` ## Send to Azure storage
-Send the Activity Log to an Azure Storage Account if you want to retain your log data longer than 90 days for audit, static analysis, or backup. If you only need to retain your events for 90 days or less you do not need to set up archival to a Storage Account, since Activity Log events are retained in the Azure platform for 90 days.
+Send the Activity Log to an Azure Storage Account if you want to retain your log data longer than 90 days for audit, static analysis, or backup. If you need to retain your events for only 90 days or less, you don't need to set up archival to a Storage Account, because Activity Log events are retained in the Azure platform for 90 days.
When you send the Activity log to Azure, a storage container is created in the Storage Account as soon as an event occurs. The blobs in the container use the following naming convention:
This section describes legacy methods for collecting the Activity log that were
Log profiles are the legacy method for sending the Activity log to Azure storage or Event Hubs. Use the following procedure to continue working with a log profile or to disable it in preparation for migrating to a diagnostic setting. 1. From the **Azure Monitor** menu in the Azure portal, select **Activity log**.
-3. Click **Diagnostic settings**.
+3. Select **Export Activity Logs**.
- ![Diagnostic settings](media/activity-log/diagnostic-settings.png)
+ ![Export activity logs](media/activity-log/diagnostic-settings-export.png)
-4. Click the purple banner for the legacy experience.
+4. Select the purple banner for the legacy experience.
![Legacy experience](media/activity-log/legacy-experience.png) ### Configure log profile using PowerShell
-If a log profile already exists, you first need to remove the existing log profile and then create a new one.
+If a log profile already exists, you first must remove the existing log profile and then create a new one.
1. Use `Get-AzLogProfile` to identify if a log profile exists. If a log profile does exist, note the *name* property.
If a log profile already exists, you first need to remove the existing log profi
| StorageAccountId |No |Resource ID of the Storage Account where the Activity Log should be saved. | | serviceBusRuleId |No |Service Bus Rule ID for the Service Bus namespace you would like to have Event Hubs created in. This is a string with the format: `{service bus resource ID}/authorizationrules/{key name}`. | | Location |Yes |Comma-separated list of regions for which you would like to collect Activity Log events. |
- | RetentionInDays |Yes |Number of days for which events should be retained in the Storage Account, between 1 and 365. A value of zero stores the logs indefinitely. |
+ | RetentionInDays |Yes |Number of days for which events should be retained in the Storage Account, from 1 through 365. A value of zero stores the logs indefinitely. |
| Category |No |Comma-separated list of event categories that should be collected. Possible values are _Write_, _Delete_, and _Action_. | ### Example script
Following is a sample PowerShell script to create a log profile that writes the
### Configure log profile using Azure CLI
-If a log profile already exists, you first need to remove the existing log profile and then create a new log profile.
+If a log profile already exists, you first must remove the existing log profile and then create a new one.
1. Use `az monitor log-profiles list` to determine whether a log profile exists. 2. Use `az monitor log-profiles delete --name "<log profile name>"` to remove the log profile, using the value from the *name* property.
-3. Use `az monitor log-profiles create` to create a new log profile:
+3. Use `az monitor log-profiles create` to create a log profile:
```azurecli-interactive az monitor log-profiles create --name "default" --location null --locations "global" "eastus" "westus" --categories "Delete" "Write" "Action" --enabled false --days 0 --service-bus-rule-id "/subscriptions/<YOUR SUBSCRIPTION ID>/resourceGroups/<RESOURCE GROUP NAME>/providers/Microsoft.EventHub/namespaces/<Event Hub NAME SPACE>/authorizationrules/RootManageSharedAccessKey"
If a log profile already exists, you first need to remove the existing log profi
| name |Yes |Name of your log profile. | | storage-account-id |Yes |Resource ID of the Storage Account to which Activity Logs should be saved. | | locations |Yes |Space-separated list of regions for which you would like to collect Activity Log events. You can view a list of all regions for your subscription using `az account list-locations --query [].name`. |
- | days |Yes |Number of days for which events should be retained, between 1 and 365. A value of zero will store the logs indefinitely (forever). If zero, then the enabled parameter should be set to false. |
+ | days |Yes |Number of days for which events should be retained, from 1 through 365. A value of zero will store the logs indefinitely (forever). If zero, then the enabled parameter should be set to false. |
|enabled | Yes |True or False. Used to enable or disable the retention policy. If True, then the days parameter must be a value greater than 0. | categories |Yes |Space-separated list of event categories that should be collected. Possible values are Write, Delete, and Action. | ### Log Analytics workspace
-The legacy method for sending the Activity log into a Log Analytics workspace is connecting the log in the workspace configuration.
+The legacy method for sending the Activity log into a Log Analytics workspace is connecting the log in the workspace configuration.
1. From the **Log Analytics workspaces** menu in the Azure portal, select the workspace to collect the Activity Log. 1. In the **Workspace Data Sources** section of the workspace's menu, select **Azure Activity log**.
-1. Click the subscription you want to connect.
+1. Select the subscription that you want to connect.
![Screenshot shows Log Analytics workspace with an Azure Activity log selected.](media/activity-log/workspaces.png)
-2. Click **Connect** to connect the Activity log in the subscription to the selected workspace. If the subscription is already connected to another workspace, click **Disconnect** first to disconnect it.
+2. Select **Connect** to connect the Activity log in the subscription to the selected workspace. If the subscription is already connected to another workspace, select **Disconnect** first to disconnect it.
![Connect Workspaces](media/activity-log/connect-workspace.png)
-To disable the setting, perform the same procedure and click **Disconnect** to remove the subscription from the workspace.
+To disable the setting, perform the same procedure and select **Disconnect** to remove the subscription from the workspace.
### Data structure changes
-Diagnostic settings send the same data as the legacy method used to send the Activity log with some changes to the structure of the *AzureActivity* table.
+The Export activity logs experience sends the same data as the legacy method used to send the Activity log, with some changes to the structure of the *AzureActivity* table.
-The columns in the following table have been deprecated in the updated schema. They still exist in *AzureActivity* but they will have no data. The replacements for these columns are not new, but they contain the same data as the deprecated column. They are in a different format, so you may need to modify log queries that use them.
+The columns in the following table have been deprecated in the updated schema. They still exist in *AzureActivity* but they have no data. The replacements for these columns are not new, but they contain the same data as the deprecated columns. They are in a different format, so you might need to modify log queries that use them.
|Activity Log JSON | Log Analytics column name<br/>*(older deprecated)* | New Log Analytics column name | Notes | |:|:|:|:|
The columns in the following table have been deprecated in the updated schema. T
> [!Important] > In some cases, the values in these columns may be in all uppercase. If you have a query that includes these columns, you should use the [=~ operator](/azure/kusto/query/datatypes-string-operators) to do a case-insensitive comparison.
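For instance, a query against *AzureActivity* can use `=~` for the case-insensitive match described above (the operation value shown here is only illustrative; substitute one from your own logs):

```kusto
AzureActivity
// =~ compares strings case-insensitively, so an all-uppercase
// "MICROSOFT.COMPUTE/VIRTUALMACHINES/WRITE" value still matches
| where OperationNameValue =~ "microsoft.compute/virtualmachines/write"
| summarize count() by CallerIpAddress
```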
-The following column have been added to *AzureActivity* in the updated schema:
+The following columns have been added to *AzureActivity* in the updated schema:
- Authorization_d - Claims_d
Monitoring solutions are accessed from the **Monitor** menu in the Azure portal.
![Azure Activity Logs tile](media/activity-log/azure-activity-logs-tile.png)
-Click the **Azure Activity Logs** tile to open the **Azure Activity Logs** view. The view includes the visualization parts in the following table. Each part lists up to 10 items matching that part's criteria for the specified time range. You can run a log query that returns all matching records by clicking **See all** at the bottom of the part.
+Select the **Azure Activity Logs** tile to open the **Azure Activity Logs** view. The view includes the visualization parts in the following table. Each part lists up to 10 items that match that part's criteria for the specified time range. You can run a log query that returns all matching records by selecting **See all** at the bottom of the part.
![Azure Activity Logs dashboard](media/activity-log/activity-log-dash.png) ### Enable the solution for new subscriptions > [!NOTE]
->You will soon no longer be able to add the Activity Logs Analytics solution to your subscription using the Azure portal. You can add it using the following procedure with a Resource Manager template.
+>You will soon no longer be able to add the Activity Logs Analytics solution to your subscription with the Azure portal. You can add it using the following procedure with a Resource Manager template.
1. Copy the following json into a file called *ActivityLogTemplate*.json.
Click the **Azure Activity Logs** tile to open the **Azure Activity Logs** view.
## Next steps- * [Read an overview of platform logs](./platform-logs-overview.md) * [Review Activity log event schema](activity-log-schema.md)
-* [Create diagnostic setting to send Activity logs to other destinations](./diagnostic-settings.md)
+* [Create diagnostic setting to send Activity logs to other destinations](./diagnostic-settings.md)
azure-monitor Alert Management Solution https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/insights/alert-management-solution.md
Title: Alert Management solution in Azure Log Analytics | Microsoft Docs description: The Alert Management solution in Log Analytics helps you analyze all of the alerts in your environment. In addition to consolidating alerts generated within Log Analytics, it imports alerts from connected System Center Operations Manager management groups into Log Analytics. -- Previously updated : 01/19/2018++ Last updated : 01/02/2022
Last updated 01/19/2018
![Alert Management icon](media/alert-management-solution/icon.png)
+> [!CAUTION]
+> This solution is no longer in active development and may not work as expected. We suggest you try using [Azure Resource Graph to query Azure Monitor alerts](../alerts/alerts-overview.md#manage-your-alert-instances-programmatically).
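+As a sketch of that suggested alternative (assuming the `alertsmanagementresources` table exposed by Azure Resource Graph; exact property paths may differ in your tenant), a query such as the following counts alerts by severity:

```kusto
// Azure Resource Graph query: count Azure Monitor alerts by severity
alertsmanagementresources
| where type =~ "microsoft.alertsmanagement/alerts"
| extend severity = tostring(properties.essentials.severity)
| summarize alertCount = count() by severity
```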
+ The Alert Management solution helps you analyze all of the alerts in your Log Analytics repository. These alerts may have come from a variety of sources including those sources [created by Log Analytics](../alerts/alerts-overview.md) or [imported from Nagios or Zabbix](../vm/monitor-virtual-machine.md). The solution also imports alerts from any [connected System Center Operations Manager management groups](../agents/om-agents.md). ## Prerequisites
The following table provides sample log searches for alert records collected by
## Next steps
-* Learn about [Alerts in Log Analytics](../alerts/alerts-overview.md) for details on generating alerts from Log Analytics.
+* Learn about [Alerts in Log Analytics](../alerts/alerts-overview.md) for details on generating alerts from Log Analytics.
azure-monitor Monitor Reference https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/monitor-reference.md
description: Reference of all services and other resources monitored by Azure Mo
Previously updated : 11/02/2021 Last updated : 11/10/2021
The table below lists the available curated visualizations and more detailed inf
|Name with docs link| State | [Azure portal Link](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/more)| Description | |:--|:--|:--|:--|
-| [Azure Monitor Workbooks for Azure Active Directory](../active-directory/reports-monitoring/howto-use-azure-monitor-workbooks.md) | GA (General availability) | [Yes](https://ms.portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Workbooks) | Azure Active Directory provides workbooks to understand the effect of your Conditional Access policies, to troubleshoot sign-in failures, and to identify legacy authentications. |
-| [Azure Backup Insights](../backup/backup-azure-monitoring-use-azuremonitor.md) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_DataProtection/BackupCenterMenuBlade/backupReportsConfigure/menuId/backupReportsConfigure) | Provides built-in monitoring and alerting capabilities in a Recovery Services vault. |
-| [Azure Monitor for Azure Cache for Redis (preview)](./insights/redis-cache-insights-overview.md) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/redisCacheInsights) | Provides a unified, interactive view of overall performance, failures, capacity, and operational health |
-| [Azure Cosmos DB Insights](./insights/cosmosdb-insights-overview.md) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/cosmosDBInsights) | Provides a view of the overall performance, failures, capacity, and operational health of all your Azure Cosmos DB resources in a unified interactive experience. |
-| [Azure Data Explorer insights](./insights/data-explorer.md) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/adxClusterInsights) | Azure Data Explorer Insights provides comprehensive monitoring of your clusters by delivering a unified view of your cluster performance, operations, usage, and failures. |
-| [Azure HDInsight (preview)](../hdinsight/log-analytics-migration.md#insights) | Preview | No | An Azure Monitor workbook that collects important performance metrics from your HDInsight cluster and provides the visualizations and dashboards for most common scenarios. Gives a complete view of a single HDInsight cluster including resource utilization and application status|
- | [Azure IoT Edge Insights](../iot-edge/how-to-explore-curated-visualizations.md) | GA | No | Visualize and explore metrics collected from the IoT Edge device right in the Azure portal using Azure Monitor Workbooks based public templates. The curated workbooks use built-in metrics from the IoT Edge runtime. These views don't need any metrics instrumentation from the workload modules. |
- | [Azure Key Vault Insights (preview)](./insights/key-vault-insights-overview.md) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/keyvaultsInsights) | Provides comprehensive monitoring of your key vaults by delivering a unified view of your Key Vault requests, performance, failures, and latency. |
- | [Azure Monitor Application Insights](./app/app-insights-overview.md) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/applicationsInsights) | Extensible Application Performance Management (APM) service which monitors the availability, performance, and usage of your web applications whether they're hosted in the cloud or on-premises. It leverages the powerful data analysis platform in Azure Monitor to provide you with deep insights into your application's operations. It enables you to diagnose errors without waiting for a user to report them. Application Insights includes connection points to a variety of development tools and integrates with Visual Studio to support your DevOps processes. |
- | [Azure Monitor Log Analytics Workspace](./logs/log-analytics-workspace-insights-overview.md) | Preview | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/lawsInsights) | Log Analytics Workspace Insights (preview) provides comprehensive monitoring of your workspaces through a unified view of your workspace usage, performance, health, agent, queries, and change log. This article will help you understand how to onboard and use Log Analytics Workspace Insights (preview). |
- | [Azure Service Bus](/azure/service-bus/) | Preview | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/serviceBusInsights) | |
- | [Azure SQL insights](./insights/sql-insights-overview.md) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/sqlWorkloadInsights) | A comprehensive interface for monitoring any product in the Azure SQL family. SQL insights uses dynamic management views to expose the data you need to monitor health, diagnose problems, and tune performance. Note: If you are just setting up SQL monitoring, use this instead of the SQL Analytics solution. |
+| [Azure Monitor Workbooks for Azure Active Directory](/azure/active-directory/reports-monitoring/howto-use-azure-monitor-workbooks) | GA (General availability) | [Yes](https://ms.portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Workbooks) | Azure Active Directory provides workbooks to understand the effect of your Conditional Access policies, to troubleshoot sign-in failures, and to identify legacy authentications. |
+| [Azure Backup](/azure/backup/backup-azure-monitoring-use-azuremonitor) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_DataProtection/BackupCenterMenuBlade/backupReportsConfigure/menuId/backupReportsConfigure) | Provides built-in monitoring and alerting capabilities in a Recovery Services vault. |
+| [Azure Monitor for Azure Cache for Redis (preview)](/azure/azure-monitor/insights/redis-cache-insights-overview) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/redisCacheInsights) | Provides a unified, interactive view of overall performance, failures, capacity, and operational health |
+| [Azure Cosmos DB Insights](/azure/azure-monitor/insights/cosmosdb-insights-overview) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/cosmosDBInsights) | Provides a view of the overall performance, failures, capacity, and operational health of all your Azure Cosmos DB resources in a unified interactive experience. |
+| [Azure Container Insights](/azure/azure-monitor/insights/container-insights-overview) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/containerInsights) | Monitors the performance of container workloads that are deployed to managed Kubernetes clusters hosted on Azure Kubernetes Service (AKS). It gives you performance visibility by collecting metrics from controllers, nodes, and containers that are available in Kubernetes through the Metrics API. Container logs are also collected. After you enable monitoring from Kubernetes clusters, these metrics and logs are automatically collected for you through a containerized version of the Log Analytics agent for Linux. |
+| [Azure Data Explorer insights](/azure/azure-monitor/insights/data-explorer) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/adxClusterInsights) | Azure Data Explorer Insights provides comprehensive monitoring of your clusters by delivering a unified view of your cluster performance, operations, usage, and failures. |
+| [Azure HDInsight (preview)](/azure/hdinsight/log-analytics-migration#insights) | Preview | No | An Azure Monitor workbook that collects important performance metrics from your HDInsight cluster and provides the visualizations and dashboards for most common scenarios. Gives a complete view of a single HDInsight cluster including resource utilization and application status|
+ | [Azure IoT Edge](/azure/iot-edge/how-to-explore-curated-visualizations) | GA | No | Visualize and explore metrics collected from the IoT Edge device right in the Azure portal using Azure Monitor Workbooks based public templates. The curated workbooks use built-in metrics from the IoT Edge runtime. These views don't need any metrics instrumentation from the workload modules. |
+ | [Azure Key Vault Insights (preview)](/azure/azure-monitor/insights/key-vault-insights-overview) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/keyvaultsInsights) | Provides comprehensive monitoring of your key vaults by delivering a unified view of your Key Vault requests, performance, failures, and latency. |
+ | [Azure Monitor Application Insights](/azure/azure-monitor/app/app-insights-overview) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/applicationsInsights) | Extensible Application Performance Management (APM) service which monitors the availability, performance, and usage of your web applications whether they're hosted in the cloud or on-premises. It leverages the powerful data analysis platform in Azure Monitor to provide you with deep insights into your application's operations. It enables you to diagnose errors without waiting for a user to report them. Application Insights includes connection points to a variety of development tools and integrates with Visual Studio to support your DevOps processes. |
+ | [Azure Monitor Log Analytics Workspace](/azure/azure-monitor/logs/log-analytics-workspace-insights-overview) | Preview | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/lawsInsights) | Log Analytics Workspace Insights (preview) provides comprehensive monitoring of your workspaces through a unified view of your workspace usage, performance, health, agent, queries, and change log. This article will help you understand how to onboard and use Log Analytics Workspace Insights (preview). |
+ | [Azure Service Bus Insights](/azure/service-bus-messaging/monitor-service-bus) | Preview | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/serviceBusInsights) | |
+ | [Azure SQL insights](/azure/azure-monitor/insights/sql-insights-overview) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/sqlWorkloadInsights) | A comprehensive interface for monitoring any product in the Azure SQL family. SQL insights uses dynamic management views to expose the data you need to monitor health, diagnose problems, and tune performance. Note: If you are just setting up SQL monitoring, use this instead of the SQL Analytics solution. |
| [Azure Storage Insights](/azure/azure-monitor/insights/storage-insights-overview) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/storageInsights) | Provides comprehensive monitoring of your Azure Storage accounts by delivering a unified view of your Azure Storage services performance, capacity, and availability. |
- | [Azure VM Insights](/azure/azure-monitor/insights/vminsights-overview) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/virtualMachines) | Monitors your Azure virtual machines (VM) and virtual machine scale sets at scale. It analyzes the performance and health of your Windows and Linux VMs, and monitors their processes and dependencies on other resources and external processes. |
- | [Azure Network Insights](./insights/network-insights-overview.md) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/networkInsights) | Provides a comprehensive view of health and metrics for all your network resource. The advanced search capability helps you identify resource dependencies, enabling scenarios like identifying resource that are hosting your website, by simply searching for your website name. |
- | [Azure Container Insights](/azure/azure-monitor/insights/container-insights-overview) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/containerInsights) | Monitors the performance of container workloads that are deployed to managed Kubernetes clusters hosted on Azure Kubernetes Service (AKS). It gives you performance visibility by collecting metrics from controllers, nodes, and containers that are available in Kubernetes through the Metrics API. Container logs are also collected. After you enable monitoring from Kubernetes clusters, these metrics and logs are automatically collected for you through a containerized version of the Log Analytics agent for Linux. |
- | [Azure Monitor for Resource Groups](./insights/resource-group-insights.md) | GA | No | Triage and diagnose any problems your individual resources encounter, while offering context as to the health and performance of the resource group as a whole. |
- | [Azure Monitor SAP](../virtual-machines/workloads/sap/monitor-sap-on-azure.md) | GA | No | An Azure-native monitoring product for anyone running their SAP landscapes on Azure. It works with both SAP on Azure Virtual Machines and SAP on Azure Large Instances. Collects telemetry data from Azure infrastructure and databases in one central location and visually correlate the data for faster troubleshooting. You can monitor different components of an SAP landscape, such as Azure virtual machines (VMs), high-availability cluster, SAP HANA database, SAP NetWeaver, and so on, by adding the corresponding provider for that component. |
+ | [Azure Network Insights](/azure/azure-monitor/insights/network-insights-overview) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/networkInsights) | Provides a comprehensive view of health and metrics for all your network resources. The advanced search capability helps you identify resource dependencies, enabling scenarios like identifying the resources that host your website simply by searching for your website name. |
+ | [Azure Monitor for Resource Groups](/azure/azure-monitor/insights/resource-group-insights) | GA | No | Triage and diagnose any problems your individual resources encounter, while offering context as to the health and performance of the resource group as a whole. |
+ | [Azure Monitor SAP](/azure/virtual-machines/workloads/sap/monitor-sap-on-azure) | GA | No | An Azure-native monitoring product for anyone running their SAP landscapes on Azure. It works with both SAP on Azure Virtual Machines and SAP on Azure Large Instances. It collects telemetry data from Azure infrastructure and databases in one central location and visually correlates the data for faster troubleshooting. You can monitor different components of an SAP landscape, such as Azure virtual machines (VMs), high-availability clusters, SAP HANA databases, SAP NetWeaver, and so on, by adding the corresponding provider for that component. |
| [Azure Stack HCI insights](/azure-stack/hci/manage/azure-stack-hci-insights) | Preview | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/azureStackHCIInsights) | Azure Monitor Workbook based. Provides health, performance, and usage insights about registered Azure Stack HCI, version 21H2 clusters that are connected to Azure and are enrolled in monitoring. It stores its data in a Log Analytics workspace, which allows it to deliver powerful aggregation and filtering and analyze data trends over time. |
- | [Windows Virtual Desktop Insights](../virtual-desktop/azure-monitor.md) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_WVD/WvdManagerMenuBlade/insights/menuId/insights) | Azure Monitor for Windows Virtual Desktop (preview) is a dashboard built on Azure Monitor Workbooks that helps IT professionals understand their Windows Virtual Desktop environments. This topic will walk you through how to set up Azure Monitor for Windows Virtual Desktop to monitor your Windows Virtual Desktop environments. |
+ | [Azure VM Insights](/azure/azure-monitor/insights/vminsights-overview) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/virtualMachines) | Monitors your Azure virtual machines (VM) and virtual machine scale sets at scale. It analyzes the performance and health of your Windows and Linux VMs, and monitors their processes and dependencies on other resources and external processes. |
+ | [Azure Virtual Desktop Insights](/azure/virtual-desktop/azure-monitor) | GA | [Yes](https://ms.portal.azure.com/#blade/Microsoft_Azure_WVD/WvdManagerMenuBlade/insights/menuId/insights) | Azure Virtual Desktop Insights is a dashboard built on Azure Monitor Workbooks that helps IT professionals understand their Windows Virtual Desktop environments. |
## Product integrations
azure-monitor View Designer Filters https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/visualize/view-designer-filters.md
Title: Filters in Azure Monitor views | Microsoft Docs description: A filter in an Azure Monitor view allows users to filter the data in the view by the value of a particular property without modifying the view itself. This article describes how to use a filter and add one to a custom view. -- Last updated 06/22/2018 # Filters in Azure Monitor views+
+> [!IMPORTANT]
+> Views in Azure Monitor [are being retired](https://azure.microsoft.com/updates/view-designer-in-azure-monitor-is-retiring-on-31-august-2023/) and transitioned to [workbooks](workbooks-overview.md) which provide additional functionality. See [Azure Monitor view designer to workbooks transition guide](view-designer-conversion-overview.md) for details on converting your existing views to workbooks.
+ A **filter** in an [Azure Monitor view](view-designer.md) allows users to filter the data in the view by the value of a particular property without modifying the view itself. For example, you could allow users of your view to filter the view for data only from a particular computer or set of computers. You can create multiple filters on a single view to allow users to filter by multiple properties. This article describes how to use a filter and add one to a custom view. ## Using a filter
azure-monitor View Designer Parts https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/visualize/view-designer-parts.md
Title: A reference guide to the View Designer parts in Azure Monitor | Microsoft Docs description: By using View Designer in Azure Monitor, you can create custom views that are displayed in the Azure portal and contain a variety of visualizations on data in the Log Analytics workspace. This article is a reference guide to the settings for the visualization parts that are available in your custom views. -- Last updated 03/12/2018 # Reference guide to View Designer visualization parts in Azure Monitor+
+> [!IMPORTANT]
+> Views in Azure Monitor [are being retired](https://azure.microsoft.com/updates/view-designer-in-azure-monitor-is-retiring-on-31-august-2023/) and transitioned to [workbooks](workbooks-overview.md) which provide additional functionality. See [Azure Monitor view designer to workbooks transition guide](view-designer-conversion-overview.md) for details on converting your existing views to workbooks.
+ By using View Designer in Azure Monitor, you can create a variety of custom views in the Azure portal that can help you visualize data in your Log Analytics workspace. This article is a reference guide to the settings for the visualization parts that are available in your custom views. For more information about View Designer, see:
The following table describes the settings for thresholds:
| Color |The color that indicates the threshold value. | ## Next steps
-* Learn about [log queries](../logs/log-query-overview.md) to support the queries in visualization parts.
+* Learn about [log queries](../logs/log-query-overview.md) to support the queries in visualization parts.
azure-monitor View Designer Tiles https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/visualize/view-designer-tiles.md
Title: A reference guide to the View Designer tiles in Azure Monitor | Microsoft Docs description: By using View Designer in Azure Monitor, you can create custom views that are displayed in the Azure portal and contain a variety of visualizations on data in the Log Analytics workspace. This article is a reference guide to the settings for the tiles that are available in your custom views. -- Last updated 01/17/2018 - # Reference guide to View Designer tiles in Azure Monitor+
+> [!IMPORTANT]
+> Views in Azure Monitor [are being retired](https://azure.microsoft.com/updates/view-designer-in-azure-monitor-is-retiring-on-31-august-2023/) and transitioned to [workbooks](workbooks-overview.md) which provide additional functionality. See [Azure Monitor view designer to workbooks transition guide](view-designer-conversion-overview.md) for details on converting your existing views to workbooks.
+ By using View Designer in Azure Monitor, you can create a variety of custom views in the Azure portal that can help you visualize data in your Log Analytics workspace. This article is a reference guide to the settings for the tiles that are available in your custom views. For more information about View Designer, see:
The **Two timelines** tile displays the results of two log queries over time as
## Next steps * Learn about [log queries](../logs/log-query-overview.md) to support the queries in tiles.
-* Add [visualization parts](view-designer-parts.md) to your custom view.
+* Add [visualization parts](view-designer-parts.md) to your custom view.
azure-monitor View Designer https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/visualize/view-designer.md
Title: Create views to analyze log data in Azure Monitor | Microsoft Docs description: By using View Designer in Azure Monitor, you can create custom views that are displayed in the Azure portal and contain a variety of visualizations on data in the Log Analytics workspace. This article contains an overview of View Designer and presents procedures for creating and editing custom views. --++ Last updated 08/04/2020
By using View Designer in Azure Monitor, you can create a variety of custom views in the Azure portal that can help you visualize data in your Log Analytics workspace. This article presents an overview of View Designer and procedures for creating and editing custom views. > [!IMPORTANT]
-> Views in Azure Monitor have been transitioned to [workbooks](workbooks-overview.md) which provide additional functionality. See [Azure Monitor view designer to workbooks transition guide](view-designer-conversion-overview.md) for details on converting your existing views to workbooks.
+> Views in Azure Monitor [are being retired](https://azure.microsoft.com/updates/view-designer-in-azure-monitor-is-retiring-on-31-august-2023/) and transitioned to [workbooks](workbooks-overview.md) which provide additional functionality. See [Azure Monitor view designer to workbooks transition guide](view-designer-conversion-overview.md) for details on converting your existing views to workbooks.
The options for working with views in edit mode are described in the following t
## Next steps * Add [Tiles](view-designer-tiles.md) to your custom view.
-* Add [Visualization parts](view-designer-parts.md) to your custom view.
+* Add [Visualization parts](view-designer-parts.md) to your custom view.
azure-portal How To Create Azure Support Request https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-portal/supportability/how-to-create-azure-support-request.md
Title: How to create an Azure support request
description: Customers who need assistance can use the Azure portal to find self-service solutions and to create and manage support requests. Previously updated : 12/07/2021 Last updated : 02/01/2022 # Create an Azure support request
You can get to **Help + support** in the Azure portal. It's available from the A
### Azure role-based access control
-To create a support request, you must be an [Owner](../../role-based-access-control/built-in-roles.md#owner), [Contributor](../../role-based-access-control/built-in-roles.md#contributor) or be assigned to the [Support Request Contributor](../../role-based-access-control/built-in-roles.md#support-request-contributor) role at the subscription level. To create a support request without a subscription, for example an Azure Active Directory scenario, you must be an [Admin](../../active-directory/roles/permissions-reference.md).
+To create a support request, you must have the [Owner](../../role-based-access-control/built-in-roles.md#owner), [Contributor](../../role-based-access-control/built-in-roles.md#contributor), or [Support Request Contributor](../../role-based-access-control/built-in-roles.md#support-request-contributor) role, or a custom role with [Microsoft.Support/*](/azure/role-based-access-control/resource-provider-operations#microsoftsupport) permissions, at the subscription level.
+
+To create a support request without a subscription, for example an Azure Active Directory scenario, you must be an [Admin](../../active-directory/roles/permissions-reference.md).
> [!IMPORTANT]
-> If a support request requires investigation into multiple subscriptions, you must have Owner, Contributor, or Support Request Contributor role for each subscription involved.
+> If a support request requires investigation into multiple subscriptions, you must have the required access for each subscription involved ([Owner](../../role-based-access-control/built-in-roles.md#owner), [Contributor](../../role-based-access-control/built-in-roles.md#contributor), [Reader](../../role-based-access-control/built-in-roles.md#reader), [Support Request Contributor](../../role-based-access-control/built-in-roles.md#support-request-contributor), or a custom role with the [Microsoft.Support/supportTickets/read](/azure/role-based-access-control/resource-provider-operations#microsoftsupport) permission).
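If the built-in roles grant more than you want, the required `Microsoft.Support` actions can also be granted through a custom role. A minimal sketch of such a role definition (the role name and assignable scope are illustrative placeholders, not values from this article):

```json
{
  "Name": "Support Request Operator (example)",
  "IsCustom": true,
  "Description": "Can create and manage Azure support requests.",
  "Actions": [
    "Microsoft.Support/*"
  ],
  "NotActions": [],
  "AssignableScopes": [
    "/subscriptions/<subscription-id>"
  ]
}
```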
### Go to Help + support from the global header
azure-resource-manager Modules https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/modules.md
Title: Bicep modules description: Describes how to define a module in a Bicep file, and how to use module scopes. Previously updated : 11/19/2021 Last updated : 02/01/2022 # Bicep modules
Bicep enables you to organize deployments into modules. A module is just a Bicep
To share modules with other people in your organization, create a [template spec](../templates/template-specs.md) or [private registry](private-module-registry.md). Template specs and modules in the registry are only available to users with the correct permissions. > [!TIP]
-> The choice between template specs and private registries is mostly a matter of preference. If you're deploying templates or Bicep files without other project artifacts, template specs are an easier option. If you're deploying project artifacts with the templates or Bicep files, you can integrate the private registry with your development work and then more easily deploy all of it from the registry.
+> The choice between module registry and template specs is mostly a matter of preference. There are a few things to consider when you choose between the two:
+>
+> - Module registry is only supported by Bicep. If you are not yet using Bicep, use template specs.
+> - Content in the Bicep module registry can only be deployed from another Bicep file. Template specs can be deployed directly from the API, Azure PowerShell, Azure CLI, and the Azure portal. You can even use [`UiFormDefinition`](../templates/template-specs-create-portal-forms.md) to customize the portal deployment experience.
+> - Bicep has some limited capabilities for embedding other project artifacts (including non-Bicep and non-ARM-template files, such as PowerShell scripts, CLI scripts, and other binaries) by using the [`loadTextContent`](./bicep-functions-files.md#loadtextcontent) and [`loadFileAsBase64`](./bicep-functions-files.md#loadfileasbase64) functions. Template specs can't package these artifacts.
Bicep modules are converted into a single Azure Resource Manager template with [nested templates](../templates/linked-templates.md#nested-template).
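A module is simply a Bicep file deployed from another Bicep file. As a minimal sketch of the pattern (file names, resource names, and the API version are illustrative, not from this article):

```bicep
// storage.bicep (a hypothetical module file)
param storageName string
param location string = resourceGroup().location

resource stg 'Microsoft.Storage/storageAccounts@2021-08-01' = {
  name: storageName
  location: location
  sku: {
    name: 'Standard_LRS'
  }
  kind: 'StorageV2'
}

output storageId string = stg.id

// main.bicep (the consuming file) would reference it like:
// module exampleModule 'storage.bicep' = {
//   name: 'storageDeploy'
//   params: {
//     storageName: 'stg${uniqueString(resourceGroup().id)}'
//   }
// }
```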
azure-resource-manager Template Specs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/template-specs.md
Title: Create & deploy template specs in Bicep description: Describes how to create template specs in Bicep and share them with other users in your organization. Previously updated : 01/19/2022 Last updated : 02/01/2022 # Azure Resource Manager template specs in Bicep
To deploy the template spec, you use standard Azure tools like PowerShell, Azure
When designing your deployment, always consider the lifecycle of the resources and group the resources that share a similar lifecycle into a single template spec. For instance, your deployments include multiple instances of Cosmos DB with each instance containing its own databases and containers. Given that the databases and the containers don't change much, you want to create one template spec to include a Cosmos DB instance and its underlying databases and containers. You can then use conditional statements in your Bicep along with copy loops to create multiple instances of these resources.
-The choice between template specs and [private module registries](./private-module-registry.md) is mostly a matter of preference. If you're deploying templates or Bicep files without other project artifacts, template specs are an easier option. If you're deploying project artifacts with the templates or Bicep files, you can integrate the private registry with your development work and then more easily deploy all of it from the registry.
+> [!TIP]
+> The choice between module registry and template specs is mostly a matter of preference. There are a few things to consider when you choose between the two:
+>
+> - Module registry is only supported by Bicep. If you are not yet using Bicep, use template specs.
+> - Content in the Bicep module registry can only be deployed from another Bicep file. Template specs can be deployed directly from the API, Azure PowerShell, Azure CLI, and the Azure portal. You can even use [`UiFormDefinition`](../templates/template-specs-create-portal-forms.md) to customize the portal deployment experience.
+> - Bicep has some limited capabilities for embedding other project artifacts (including non-Bicep and non-ARM-template files, such as PowerShell scripts, CLI scripts, and other binaries) by using the [`loadTextContent`](./bicep-functions-files.md#loadtextcontent) and [`loadFileAsBase64`](./bicep-functions-files.md#loadfileasbase64) functions. Template specs can't package these artifacts.
### Microsoft Learn
azure-resource-manager Deploy Rest https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/templates/deploy-rest.md
Title: Deploy resources with REST API and template description: Use Azure Resource Manager and Resource Manager REST API to deploy resources to Azure. The resources are defined in a Resource Manager template. Previously updated : 10/22/2020 Last updated : 02/01/2022 # Deploy resources with ARM templates and Azure Resource Manager REST API
The examples in this article use resource group deployments.
GET https://management.azure.com/subscriptions/{subscriptionId}/resourcegroups/{resourceGroupName}/providers/Microsoft.Resources/deployments/{deploymentName}?api-version=2020-10-01 ```
+## Deploy with ARMClient
+
+ARMClient is a simple command line tool to invoke the Azure Resource Manager API. To install the tool, see [ARMClient](https://github.com/projectkudu/ARMClient).
+
+To list your subscriptions:
+
+```cmd
+armclient GET /subscriptions?api-version=2021-04-01
+```
+
+To list your resource groups:
+
+```cmd
+armclient GET /subscriptions/<subscription-id>/resourceGroups?api-version=2021-04-01
+```
+
+Replace **&lt;subscription-id>** with your Azure subscription ID.
+
+To create a resource group in the *Central US* region:
+
+```cmd
+armclient PUT /subscriptions/<subscription-id>/resourceGroups/<resource-group-name>?api-version=2021-04-01 "{location: 'central us', properties: {}}"
+```
+
+Alternatively, you can put the body into a JSON file called **CreateRg.json**:
+
+```json
+{
+ "location": "Central US",
+ "properties": { }
+}
+```
+
+```cmd
+armclient PUT /subscriptions/<subscription-id>/resourceGroups/<resource-group-name>?api-version=2021-04-01 '@CreateRg.json'
+```
+
+For more information, see [ARMClient: a command line tool for the Azure API](http://blog.davidebbo.com/2015/01/azure-resource-manager-client.html).
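The same resource-group deployment endpoint shown earlier can be composed in any HTTP client. A minimal sketch in Python that only builds the request (token acquisition and actually sending the request are omitted; the function and parameter names are illustrative, not from this article):

```python
# Sketch only: builds the ARM REST request for a resource group deployment.
# Names and values here are illustrative placeholders.
import json

def deployment_request(subscription_id, resource_group, deployment_name,
                       template, parameters, api_version="2020-10-01"):
    """Return (method, url, body) for a resource-group deployment PUT."""
    url = (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourcegroups/{resource_group}"
        f"/providers/Microsoft.Resources/deployments/{deployment_name}"
        f"?api-version={api_version}"
    )
    body = json.dumps({
        "properties": {
            "mode": "Incremental",
            "template": template,
            "parameters": parameters,
        }
    })
    return "PUT", url, body

method, url, body = deployment_request(
    "<subscription-id>", "exampleGroup", "ExampleDeployment",
    template={"resources": []}, parameters={},
)
```

The returned tuple can then be handed to any HTTP library along with an `Authorization: Bearer` header.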
+ ## Deployment name You can give your deployment a name such as `ExampleDeployment`.
azure-sql Connectivity Settings https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/connectivity-settings.md
The connectivity settings are accessible from the **Firewalls and virtual networ
## Deny public network access
-The defaukt for this setting is **No** so that customers can connect by using either public endpoints (with IP-based server- level firewall rules or with virtual-network firewall rules) or private endpoints (by using Azure Private Link), as outlined in the [network access overview](network-access-controls-overview.md).
+The default for this setting is **No** so that customers can connect by using either public endpoints (with IP-based server-level firewall rules or with virtual-network firewall rules) or private endpoints (by using Azure Private Link), as outlined in the [network access overview](network-access-controls-overview.md).
When **Deny public network access** is set to **Yes**, only connections via private endpoints are allowed. All connections via public endpoints will be denied with an error message similar to:
azure-sql Resource Limits https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/managed-instance/resource-limits.md
Support for the premium-series hardware generations (public preview) is currentl
| Australia East | Yes | Yes | | Canada Central | Yes | | | Canada East | Yes | |
+| Central US | Yes | |
+| East US | Yes | |
| East US 2 | Yes | | | France Central | | Yes | | Germany West Central | | Yes |
azure-sql Sql Assessment For Sql Vm https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/virtual-machines/windows/sql-assessment-for-sql-vm.md
The SQL Assessment feature of the Azure portal identifies possible performance i
The SQL Assessment feature is currently in preview.
+To learn more, watch this video on [SQL Assessment](/shows/Data-Exposed/?WT.mc_id=dataexposed-c9-niner):
+<iframe src="https://aka.ms/docs/player?id=13b2bf63-485c-4ec2-ab14-a1217734ad9f" width="640" height="370" style="border: 0; max-width: 100%; min-width: 100%;"></iframe>
+++ ## Overview Once the SQL Assessment feature is enabled, your SQL Server instance and databases are scanned to provide recommendations for things like indexes, deprecated features, enabled or missing trace flags, statistics, etc. Recommendations are surfaced to the [SQL VM management page](manage-sql-vm-portal.md) of the [Azure portal](https://portal.azure.com/#blade/HubsExtension/BrowseResource/resourceType/Microsoft.SqlVirtualMachine%2FSqlVirtualMachines).
backup Backup Azure Sap Hana Database Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/backup/backup-azure-sap-hana-database-troubleshoot.md
Title: Troubleshoot SAP HANA databases backup errors description: Describes how to troubleshoot common errors that might occur when you use Azure Backup to back up SAP HANA databases. Previously updated : 05/31/2021 Last updated : 02/01/2022+++ # Troubleshoot backup of SAP HANA databases on Azure
-This article provides troubleshooting information for backing up SAP HANA databases on Azure virtual machines. For more information on the SAP HANA backup scenarios we currently support, see [Scenario support](sap-hana-backup-support-matrix.md#scenario-support).
+This article provides troubleshooting information to back up SAP HANA databases on Azure virtual machines. For more information on the SAP HANA backup scenarios we currently support, see [Scenario support](sap-hana-backup-support-matrix.md#scenario-support).
## Prerequisites and Permissions
Refer to the [prerequisites](tutorial-backup-sap-hana-db.md#prerequisites) and [
| **Error message** | `Azure Backup does not have required role privileges to carry out Backup and Restore operations` | | - | |
-| **Possible causes** | All operations will fail with this error when the Backup user (AZUREWLBACKUPHANAUSER) doesn't have the **SAP_INTERNAL_HANA_SUPPORT** role assigned or the role may have been overwritten. |
+| **Possible causes** | All operations fail with this error when the Backup user (AZUREWLBACKUPHANAUSER) doesn't have the **SAP_INTERNAL_HANA_SUPPORT** role assigned or the role might have been overwritten. |
| **Recommended action** | Download and run the [pre-registration script](https://aka.ms/scriptforpermsonhana) on the SAP HANA instance, or manually assign the **SAP_INTERNAL_HANA_SUPPORT** role to the Backup user (AZUREWLBACKUPHANAUSER).<br><br>**Note**<br><br>If you are using HANA 2.0 SPS04 Rev 46 and later, this error doesn't occur as the use of the **SAP_INTERNAL_HANA_SUPPORT** role is deprecated in these HANA versions. | ### UserErrorInOpeningHanaOdbcConnection | **Error message** | `Failed to connect to HANA system` | | | |
-| **Possible causes** | <ul><li>Connection to HANA instance failed</li><li>System DB is offline</li><li>Tenant DB is offline</li><li>Backup user (AZUREWLBACKUPHANAUSER) doesn't have enough permissions/privileges.</li></ul> |
-| **Recommended action** | Check if the system is up and running. If the database(s) is up and running, ensure that the required permissions are set by downloading and running the [pre-registration script](https://aka.ms/scriptforpermsonhana) on the SAP HANA instance. |
+| **Possible causes** | <ul><li>Connection to HANA instance failed</li><li>System DB is offline</li><li>Tenant DB is offline</li><li>Backup user (AZUREWLBACKUPHANAUSER) doesn't have enough permissions/privileges.</li></ul> |
+| **Recommended action** | Check if the system is running. If one or more databases are running, ensure that the required permissions are set. To do so, download and run the [pre-registration script](https://aka.ms/scriptforpermsonhana) on the SAP HANA instance. |
### UserErrorHanaInstanceNameInvalid | **Error message** | `The specified SAP HANA instance is either invalid or can't be found` | | | | | **Possible causes** | <ul><li>The specified SAP HANA instance is either invalid or can't be found.</li><li>Multiple SAP HANA instances on a single Azure VM can't be backed up.</li></ul> |
-| **Recommended action** | <ul><li>Ensure that only one HANA instance is running on the Azure VM.</li><li>Run the script from the Discover DB pane (you can also find this [here](https://aka.ms/scriptforpermsonhana)) with the correct SAP HANA instance to resolve the issue.</li></ul> |
+| **Recommended action** | <ul><li>Ensure that only one HANA instance is running on the Azure VM.</li><li> To resolve the issue, run the script from the _Discover DB_ pane (you can also find the script [here](https://aka.ms/scriptforpermsonhana)) with the correct SAP HANA instance.</li></ul> |
### UserErrorHANALSNValidationFailure | **Error message** | `Backup log chain is broken` | | | |
-| **Possible causes** | HANA LSN Log chain break can be triggered for various reasons, including:<ul><li>Azure Storage call failure to commit backup.</li><li>The Tenant DB is offline.</li><li>Extension upgrade has terminated an in-progress Backup job.</li><li>Unable to connect to Azure Storage during backup.</li><li>SAP HANA has rolled back a transaction in the backup process.</li><li>A backup is complete, but catalog is not yet updated with success in HANA system.</li><li>Backup failed from Azure Backup perspective, but success from HANA's perspective - the log backup/catalog destination may have been updated from backint to file system, or the backint executable may have been changed.</li></ul> |
-| **Recommended action** | To resolve this issue, Azure Backup triggers an auto-heal Full backup. While this auto-heal backup is in progress, all log backups are triggered by HANA fail with **OperationCancelledBecauseConflictingAutohealOperationRunningUserError**. Once the auto-heal Full backup is complete, logs and all other backups will start working as expected.<br>If you do not see an auto-heal full backup triggered or any successful backup (Full/Differential/ Incremental) in 24 hours, contact Microsoft support.</br> |
+| **Possible causes** | HANA LSN Log chain break can be triggered for various reasons, including:<ul><li>Azure Storage call failure to commit backup.</li><li>The Tenant DB is offline.</li><li>Extension upgrade has terminated an in-progress Backup job.</li><li>Unable to connect to Azure Storage during backup.</li><li>SAP HANA has rolled back a transaction in the backup process.</li><li>A backup is complete, but the catalog is not yet updated with success in the HANA system.</li><li>Backup failed from the Azure Backup perspective, but succeeded from the perspective of HANA: the log backup/catalog destination might have been updated from backint to the file system, or the backint executable might have been changed.</li></ul> |
+| **Recommended action** | To resolve this issue, Azure Backup triggers an auto-heal Full backup. While this auto-heal backup is in progress, all log backups are triggered by HANA fail with **OperationCancelledBecauseConflictingAutohealOperationRunningUserError**. Once the auto-heal Full backup is complete, logs and all other backups start working as expected.<br>If you do not see an auto-heal full backup triggered or any successful backup (Full/Differential/ Incremental) in 24 hours, contact Microsoft support.</br> |
### UserErrorSDCtoMDCUpgradeDetected | **Error message** | `SDC to MDC upgrade detected.` | | | |
-| **Possible causes** | When an SDC system is upgraded to MDC, backups fail with this error. |
+| **Possible causes** | When an SDC system is upgraded to MDC, backups fail with this error. |
| **Recommended action** | To troubleshoot and resolve the issue, see [SDC to MDC upgrade](#sdc-to-mdc-upgrade-with-a-change-in-sid). | ### UserErrorInvalidBackintConfiguration
Refer to the [prerequisites](tutorial-backup-sap-hana-db.md#prerequisites) and [
| **Error message** | `Backups will fail with this error when the Backint Configuration is incorrectly updated.` | | | | | **Possible causes** | The backint configuration updated during the Configure Protection flow by Azure Backup is either altered/updated by the customer. |
-| **Recommended action** | Check if the following (backint) parameters are set:<br><ul><li>[catalog_backup_using_backint:true]</li><li>[enable_accumulated_catalog_backup:false]</li><li>[parallel_data_backup_backint_channels:1]</li><li>[log_backup_timeout_s:900)]</li><li>[backint_response_timeout:7200]</li></ul>If backint-based parameters are present at the HOST level, remove them. However, if parameters aren't present at the HOST level but have been manually modified at a database level, ensure that the database level values are set above. Or, run [stop protection with retain backup data](./sap-hana-db-manage.md#stop-protection-for-an-sap-hana-database) from the Azure portal, and then select Resume backup. |
+| **Recommended action** | Check if the following (backint) parameters are set:<br><ul><li>[catalog_backup_using_backint:true]</li><li>[enable_accumulated_catalog_backup:false]</li><li>[parallel_data_backup_backint_channels:1]</li><li>[log_backup_timeout_s:900]</li><li>[backint_response_timeout:7200]</li></ul>If backint-based parameters are present at the HOST level, remove them. However, if the parameters aren't present at the HOST level, but are manually modified at a database level, ensure that the database-level values are set as listed above. Or, run [stop protection with retain backup data](./sap-hana-db-manage.md#stop-protection-for-an-sap-hana-database) from the Azure portal, and then select Resume backup. |
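In SAP HANA, these backint parameters typically live in the `[backup]` section of `global.ini`. As an illustrative sketch of what the checked values would look like there (assuming the standard `global.ini` layout; verify against your HANA version):

```ini
[backup]
catalog_backup_using_backint = true
enable_accumulated_catalog_backup = false
parallel_data_backup_backint_channels = 1
log_backup_timeout_s = 900
backint_response_timeout = 7200
```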
### UserErrorIncompatibleSrcTargetSystemsForRestore |**Error message** | `The source and target systems for restore are incompatible.` | ||| |**Possible causes** | The restore flow fails with this error when the source and target HANA databases, and systems are incompatible. |
-|Recommended action | Ensure that your restore scenario isn't in the following list of possible incompatible restores:<br> **Case 1:** SYSTEMDB cannot be renamed during restore.<br>**Case 2:** Source - SDC and target - MDC: The source database cannot be restored as SYSTEMDB or tenant DB on the target. <br> **Case 3:** Source - MDC and target - SDC: The source database (SYSTEMDB or tenant DB) cannot be restored to the target.<br>To learn more, see the note **1642148** in the [SAP support launchpad](https://launchpad.support.sap.com). |
+|Recommended action | Ensure that your restore scenario isn't in the following list of possible incompatible restores:<br> **Case 1:** SYSTEMDB cannot be renamed during restore.<br>**Case 2:** Source (SDC) and target (MDC): The source database cannot be restored as SYSTEMDB or tenant DB on the target. <br> **Case 3:** Source (MDC) and target (SDC): The source database (SYSTEMDB or tenant DB) cannot be restored to the target.<br>To learn more, see the note **1642148** in the [SAP support launchpad](https://launchpad.support.sap.com). |
### UserErrorHANAPODoesNotExist **Error message** | `Database configured for backup does not exist.` | --
-**Possible causes** | If a database that has been configured for backup is deleted, then all scheduled and ad-hoc backups on this database will fail.
+**Possible causes** | If you delete a database that is configured for backup, all scheduled and on-demand backups on this database will fail.
**Recommended action** | Verify if the database is deleted. Re-create the database or [stop protection](sap-hana-db-manage.md#stop-protection-for-an-sap-hana-database) (with or without retain data) for the database. ### UserErrorInsufficientPrivilegeOfDatabaseUser **Error message** | `Azure Backup does not have enough privileges to carry out Backup and Restore operations.` - |
-**Possible causes** | Backup user (AZUREWLBACKUPHANAUSER) created by the pre-registration script doesn't have one or more of the following roles assigned:<ul><li>For MDC, DATABASE ADMIN and BACKUP ADMIN (for HANA 2.0 SPS05 and later) to create new databases during restore.</li><li>For SDC, BACKUP ADMIN to create new databases during restore.</li><li>CATALOG READ to read the backup catalog.</li><li>SAP_INTERNAL_HANA_SUPPORT to access a few private tables. Only required for SDC and MDC versions prior to HANA 2.0 SPS04 Rev 46. This is not required for HANA 2.0 SPS04 Rev 46 and later as we are getting the required information from public tables now with the fix from HANA team.</li></ul>
-**Recommended action** | To resolve the issue, add the required roles and permissions manually to the Backup user (AZUREWLBACKUPHANAUSER), or download and run the pre-registration script on the [SAP HANA instance](https://aka.ms/scriptforpermsonhana).
+**Possible causes** | Backup user (AZUREWLBACKUPHANAUSER) created by the pre-registration script doesn't have one or more of the following roles assigned:<ul><li>For MDC, DATABASE ADMIN and BACKUP ADMIN (for HANA 2.0 SPS05 and later) to create new databases during restore.</li><li>For SDC, BACKUP ADMIN to create new databases during restore.</li><li>CATALOG READ to read the backup catalog.</li><li>SAP_INTERNAL_HANA_SUPPORT to access a few private tables. Only required for SDC and MDC versions prior to HANA 2.0 SPS04 Rev 46. It's not required for HANA 2.0 SPS04 Rev 46 and later, because we now get the required information from public tables with the fix from the HANA team.</li></ul>
+**Recommended action** | To resolve the issue, add the required roles and permissions manually to the Backup user (AZUREWLBACKUPHANAUSER). Or, you can download and run the pre-registration script on the [SAP HANA instance](https://aka.ms/scriptforpermsonhana).
### UserErrorDatabaseUserPasswordExpired
Refer to the [prerequisites](tutorial-backup-sap-hana-db.md#prerequisites) and [
**Error message** | `Remedial Backup in progress.` - | -
-**Possible causes** | Azure Backup triggers a remedial full backup to handle LSN log chain break. While this remedial full is in progress, backups (Full/ Differential/Incremental) triggered through the portal/CLI fails with this error.
-**Recommended action** | Wait for the remedial full backup to complete successfully before triggering another backup.
+**Possible causes** | Azure Backup triggers a remedial full backup to handle LSN log chain break. While the remedial full backup is in progress, backups (Full/Differential/Incremental) triggered through the portal/CLI fail with this error.
+**Recommended action** | Wait for the remedial full backup to complete successfully before you trigger another backup.
### OperationCancelledBecauseConflictingOperationRunningUserError **Error message** | `Conflicting operation in progress.` -- | - **Possible causes** | A Full/Differential/Incremental backup triggered through portal/CLI/native HANA clients, while another Full/Differential/Incremental backup is already in progress.
-**Recommended action** | Wait for the active backup job to complete before triggering a new Full/delta backup.
+**Recommended action** | Wait for the active backup job to complete before you trigger a new Full/delta backup.
### OperationCancelledBecauseConflictingAutohealOperationRunning UserError **Error message** | `Auto-heal Full backup in progress.` - | -
-**Possible causes** | Azure Backup triggers an auto-heal Full backup to resolve **UserErrorHANALSNValidationFailure**. While this auto-heal backup is in progress, all the log backups triggered by HANA fail with **OperationCancelledBecauseConflictingAutohealOperationRunningUserError**.<br>Once the auto-heal Full backup is complete, logs and all other backups will start working as expected.</br>
-**Recommended action** | Wait for the auto-heal Full backup to complete before triggering a new Full/delta backup.
+**Possible causes** | Azure Backup triggers an auto-heal Full backup to resolve **UserErrorHANALSNValidationFailure**. While this auto-heal backup is in progress, all the log backups triggered by HANA fail with **OperationCancelledBecauseConflictingAutohealOperationRunningUserError**.<br>Once the auto-heal Full backup is complete, logs and all other backups start working as expected.</br>
+**Recommended action** | Wait for the auto-heal Full backup to complete before you trigger a new Full/delta backup.
### UserErrorHanaPreScriptNotRun **Error message** | `Pre-registration script not run.` | --
-**Possible causes** | The SAP HANA pre-registration script for setting up the environment has not been run.
+**Possible causes** | The SAP HANA pre-registration script to set up the environment has not been run.
**Recommended action** | Download and run the [pre-registration script](https://aka.ms/scriptforpermsonhana) on the SAP HANA instance.
Refer to the [prerequisites](tutorial-backup-sap-hana-db.md#prerequisites) and [
**Error message** | `RecoverySys.py could not be run successfully to restore System DB.` -- |
-**Possible causes** | Possible causes for System DB restore to fail are:<ul><li>Azure Backup is unable to find **Recoverysys.py** on the HANA machine. This happens when the HANA environment isn't set up properly.</li><li>**Recoverysys.py** is present, but triggering this script has failed to invoke HANA to perform the restore.</li><li>Recoverysys.py has successfully invoked HANA to perform the restore, but HANA fails to restore.</li></ul>
-**Recommended action** | <ul><li>For issue 1, work with the SAP HANA team to fix the issue.</li><li>For 2 and 3, see the log trace by running the HDSetting.sh command in sid-adm prompt. For example, _/usr/sap/SID/HDB00/HDBSetting.sh_.</li></ul>Share these findings with the SAP HANA team to get the issue fixed.
+**Possible causes** | Possible causes for System DB restore to fail are:<ul><li>Azure Backup is unable to find **Recoverysys.py** on the HANA machine. It happens when the HANA environment isn't set up properly.</li><li>**Recoverysys.py** is present, but when you trigger this script, it fails to invoke HANA to perform the restore.</li><li>Recoverysys.py has successfully invoked HANA to perform the restore, but HANA fails to restore.</li></ul>
+**Recommended action** | <ul><li>For issue 1, work with the SAP HANA team to fix the issue.</li><li>For issues 2 and 3, run the HDBSetting.sh command in the sid-adm prompt and see the log trace. For example, _/usr/sap/SID/HDB00/HDBSetting.sh_.</li></ul>Share these findings with the SAP HANA team to get the issue fixed.
### UserErrorDBNameNotInCorrectFormat
Assume an SDC HANA instance "H21" is backed up. The backup items page will show
Note the following points: -- By default, the restored db name will be populated with the backup item name. In this case, h21(sdc).-- Selecting the target as H11 won't change the restored db name automatically. **It should be edited to h11(sdc)**. Regarding SDC, the restored db name will be the target instance ID with lowercase letters and 'sdc' appended in brackets.
+- By default, the restored database name will be populated with the backup item name. In this case, h21(sdc).
+- Selecting the target as H11 won't change the restored database name automatically. **It should be edited to h11(sdc)**. Regarding SDC, the restored database name will be the target instance ID with lowercase letters and 'sdc' appended in brackets.
- Since SDC can have only single database, you also need to select the checkbox to allow override of the existing database data with the recovery point data. - Linux is case-sensitive. So be careful to preserve the case. ### Multiple Container Database (MDC) restore
-In multiple container databases for HANA, the standard configuration is SYSTEMDB + 1 or more Tenant DBs. Restoring an entire SAP HANA instance means to restore both SYSTEMDB and Tenant DBs. One restores SYSTEMDB first and then proceeds for Tenant DB. System DB essentially means to override the system information on the selected target. This restore also overrides the BackInt related information in the target instance. So after the system DB is restored to a target instance, run the pre-registration script again. Only then the subsequent tenant DB restores will succeed.
+In multiple container databases for HANA, the standard configuration is SYSTEMDB + 1 or more Tenant DBs. Restoring an entire SAP HANA instance restores both SYSTEMDB and Tenant DBs: restore SYSTEMDB first, and then proceed to the Tenant DBs. Restoring the system DB overrides the system information on the selected target, including the BackInt-related information in the target instance. So after the system DB is restored to a target instance, run the pre-registration script again. Only then will the subsequent tenant DB restores succeed.
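As a quick illustration (a hypothetical helper, not part of Azure Backup), the SDC restored-name rule described earlier, the target instance ID in lowercase with 'sdc' appended in brackets, can be sketched as:

```python
# Illustrative only: derives the restored SDC database name from a target
# HANA instance ID, per the rule above (lowercase, then append "(sdc)").
def restored_sdc_db_name(target_instance_id: str) -> str:
    return f"{target_instance_id.lower()}(sdc)"

print(restored_sdc_db_name("H11"))  # h11(sdc)
```

This matches the example above: a backed-up SDC instance "H21" appears as h21(sdc), and restoring to target H11 requires editing the name to h11(sdc).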
## Back up a replicated VM ### Scenario 1
-The original VM was replicated using Azure Site Recovery or Azure VM backup. The new VM was built to simulate the old VM. That is, the settings are exactly the same. (This is because the original VM was deleted and the restore was done from VM backup or Azure Site Recovery).
+The original VM was replicated using Azure Site Recovery or Azure VM backup. The new VM was built to simulate the old VM. That is, the settings are exactly the same. (That's because the original VM was deleted and the restore was done from VM backup or Azure Site Recovery).
This scenario could include two possible cases. Learn how to back up the replicated VM in both of these cases:
This scenario could include two possible cases. Learn how to back up the replica
- a different name than the deleted VM - the same name as the deleted VM but is in a different resource group or subscription (as compared to the deleted VM)
- If this is the case, then do the following steps:
+ If so, then follow these steps:
- The extension is already present on the VM, but isn't visible to any of the services - Run the pre-registration script
- - If you discover and protect the new databases, you'll start seeing duplicate active databases in the portal. To avoid this, [Stop protection with retain data](sap-hana-db-manage.md#stop-protection-for-an-sap-hana-database) for the old databases. Then continue with the remaining steps.
- - Discover the databases to enable backup
+ - If you discover and protect the new databases, you start seeing duplicate active databases in the portal. To avoid this, [Stop protection with retain data](sap-hana-db-manage.md#stop-protection-for-an-sap-hana-database) for the old databases. Then continue with the remaining steps.
+ - Discover the databases
- Enable backups on these databases
- - The already existing backed up databases (from the deleted VM) will continue to be stored in the vault (with their backups being retained according to the policy)
+ - The already existing backed-up databases (from the deleted VM) continue to be stored in the vault, with their backups retained according to the policy.
### Scenario 2
-The original VM was replicated using Azure Site Recovery or Azure VM backup. The new VM was built out of the content – to be used as a template. This is a new VM with a new SID.
+The original VM was replicated using Azure Site Recovery or Azure VM backup. The new VM was built out of the content, to be used as a template. The VM is new with a new SID.
-Follow these steps to enable backups on the new VM:
+Follow these steps to enable backups on the new VM:
- The extension is already present on the VM, but not visible to any of the services - Run the pre-registration script. Based on the SID of the new VM, two scenarios can arise:
- - The original VM and the new VM have the same SID. The pre-registration script will run successfully.
- - The original VM and the new VM have different SIDs. The pre-registration script will fail. Contact support to get help in this scenario.
+ - The original VM and the new VM have the same SID. The pre-registration script runs successfully.
+ - The original VM and the new VM have different SIDs. The pre-registration script fails. Contact support to get help in this scenario.
- Discover the databases that you want to back up - Enable backups on these databases
Upgrades to the OS, SDC version change, or MDC version change that don't cause a
- Ensure that the new OS version, SDC, or MDC version are currently [supported by Azure Backup](sap-hana-backup-support-matrix.md#scenario-support) - [Stop protection with retain data](sap-hana-db-manage.md#stop-protection-for-an-sap-hana-database) for the database - Perform the upgrade or update-- Rerun the pre-registration script. Often, the upgrade process may remove [the necessary roles](tutorial-backup-sap-hana-db.md#what-the-pre-registration-script-does). Running the pre-registration script will help verify all the required roles.
+- Rerun the pre-registration script. Often, the upgrade process might remove [the necessary roles](tutorial-backup-sap-hana-db.md#what-the-pre-registration-script-does). Run the pre-registration script to verify all the required roles.
- Resume protection for the database again ## SDC to MDC upgrade with no change in SID
Upgrades from SDC to MDC that don't cause a SID change can be handled as follows
- Rerun the [pre-registration script](https://aka.ms/scriptforpermsonhana) - Re-register the extension for the same machine in the Azure portal (**Backup** -> **View details** -> Select the relevant Azure VM -> Re-register) - Select **Rediscover DBs** for the same VM. This action should show the new DBs in step 3 as SYSTEMDB and Tenant DB, not SDC-- The older SDC database will continue to exist in the vault and have the old backed-up data retained according to the policy
+- The older SDC database continues to exist in the vault, with the old backed-up data retained according to the policy
- Configure backup for these databases ## SDC to MDC upgrade with a change in SID
Upgrades from SDC to MDC that cause a SID change can be handled as follows:
- Ensure that the new MDC version is currently [supported by Azure Backup](sap-hana-backup-support-matrix.md#scenario-support) - **Stop protection with retain data** for the old SDC database
+- Move the _config.json_ file located at _/opt/msawb/etc/config/SAPHana/_.
- Perform the upgrade. After completion, the HANA system is now MDC with a system DB and tenant DBs-- Rerun the [pre-registration script](https://aka.ms/scriptforpermsonhana) with correct details (new SID and MDC). Due to a change in SID, you may face issues with successfully running the script. Contact Azure Backup support if you face issues.
+- Rerun the [pre-registration script](https://aka.ms/scriptforpermsonhana) with correct details (new SID and MDC). Due to a change in SID, you might have trouble running the script successfully. Contact Azure Backup support if you face issues.
- Re-register the extension for the same machine in the Azure portal (**Backup** -> **View details** -> Select the relevant Azure VM -> Re-register) - Select **Rediscover DBs** for the same VM. This action should show the new DBs in step 3 as SYSTEMDB and Tenant DB, not SDC-- The older SDC database will continue to exist in the vault and have old backed up data retained according to the policy
+- The older SDC database continues to exist in the vault, with old backed-up data retained according to the policy
- Configure backup for these databases ## Re-registration failures
Check for one or more of the following symptoms before you trigger the re-regist
- The VM is shut down, so backups can't take place - Network issues
-These symptoms may arise for one or more of the following reasons:
+These symptoms might arise for one or more of the following reasons:
- An extension was deleted or uninstalled from the portal. - The VM was restored back in time through in-place disk restore. - The VM was shut down for an extended period, so the extension configuration on it expired.-- The VM was deleted, and another VM was created with the same name and in the same resource group as the deleted VM.
+- The VM was deleted, and another VM was created with the same name and in the same resource group as the deleted VM.
In the preceding scenarios, we recommend that you trigger a re-register operation on the VM. ## Next steps -- Review the [frequently asked questions](./sap-hana-faq-backup-azure-vm.yml) about backing up SAP HANA databases on Azure VMs.
+- Review the [frequently asked questions](./sap-hana-faq-backup-azure-vm.yml) about backing up SAP HANA databases on Azure VMs.
bastion Connect Native Client Windows https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/bastion/connect-native-client-windows.md
Title: 'Connect to a VM using the native Windows client and Azure Bastion'
+ Title: 'Connect to a VM using the native client and Azure Bastion'
-description: Learn how to connect to a VM from a Windows computer by using Bastion and the native Windows client.
+description: Learn how to connect to a VM from a Windows computer by using Bastion and the native client.
Previously updated : 12/01/2021 Last updated : 01/31/2022 # Connect to a VM using Bastion and the native client on your workstation (Preview)
-Azure Bastion now offers support for connecting to target VMs in Azure using a native RDP or SSH client on your local workstation. This feature lets you connect to your target VMs via Bastion using Azure CLI and expands your sign-in options to include local SSH key pair and Azure Active Directory (Azure AD). This article helps you configure Bastion with the required settings, and then connect to a VM in the VNet. For more information, see the [What is Azure Bastion?](bastion-overview.md).
+Azure Bastion offers support for connecting to target VMs in Azure using a native RDP or SSH client on your local workstation. This feature lets you connect to your target VMs via Bastion using Azure CLI, and expands your sign-in options to include a local SSH key pair and Azure Active Directory (Azure AD). This article helps you configure Bastion with the required settings, and then connect to a VM in the VNet. For more information, see [What is Azure Bastion?](bastion-overview.md).
> [!NOTE] > This configuration requires the Standard SKU for Azure Bastion.
Azure Bastion now offers support for connecting to target VMs in Azure using a n
Currently, this feature has the following limitations:
-* Signing in using an SSH private key stored in Azure Key Vault is not supported with this feature. Download your private key to a file on your local machine before signing in to your Linux VM using an SSH key pair.
+* Signing in using an SSH private key stored in Azure Key Vault isn't supported with this feature. Download your private key to a file on your local machine before signing in to your Linux VM using an SSH key pair.
## <a name="prereq"></a>Prerequisites
-Before you begin, verify that you have met the following criteria:
+Before you begin, verify that you've met the following criteria:
* The latest version of the CLI commands (version 2.32 or later) is installed. For information about installing the CLI commands, see [Install the Azure CLI](/cli/azure/install-azure-cli) and [Get Started with Azure CLI](/cli/azure/get-started-with-azure-cli). * An Azure virtual network. * A virtual machine in the virtual network. * If you plan to sign in to your virtual machine using your Azure AD credentials, make sure your virtual machine is set up using one of the following methods:
- * Enable Azure AD login for a [Windows VM](../active-directory/devices/howto-vm-sign-in-azure-ad-windows.md) or [Linux VM](../active-directory/devices/howto-vm-sign-in-azure-ad-linux.md).
+ * Enable Azure AD sign-in for a [Windows VM](../active-directory/devices/howto-vm-sign-in-azure-ad-windows.md) or [Linux VM](../active-directory/devices/howto-vm-sign-in-azure-ad-linux.md).
* [Configure your Windows VM to be Azure AD-joined](../active-directory/devices/concept-azure-ad-join.md). * [Configure your Windows VM to be hybrid Azure AD-joined](../active-directory/devices/concept-azure-ad-join-hybrid.md).
Verify that the following roles and ports are configured in order to connect.
* Reader role on the virtual machine. * Reader role on the NIC with private IP of the virtual machine. * Reader role on the Azure Bastion resource.
-* Virtual Machine Administrator Login or Virtual Machine User Login role, if you are using the Azure AD sign-in method. You only need to do this if you're enabling Azure AD login using the process described in this article: [Azure Windows VMs and Azure AD](../active-directory/devices/howto-vm-sign-in-azure-ad-windows.md) or [Azure Linux VMs and Azure AD](../active-directory/devices/howto-vm-sign-in-azure-ad-linux.md).
+* Virtual Machine Administrator Login or Virtual Machine User Login role, if you're using the Azure AD sign-in method. You only need to do this if you're enabling Azure AD login using the process described in this article: [Azure Windows VMs and Azure AD](../active-directory/devices/howto-vm-sign-in-azure-ad-windows.md) or [Azure Linux VMs and Azure AD](../active-directory/devices/howto-vm-sign-in-azure-ad-linux.md).
### Ports To connect to a Linux VM using native client support, you must have the following ports open on your Linux VM: * Inbound port: SSH (22) *or*
-* Inbound port: Custom value (you will then need to specify this custom port when you connect to the VM via Azure Bastion)
+* Inbound port: Custom value (you'll then need to specify this custom port when you connect to the VM via Azure Bastion)
To connect to a Windows VM using native client support, you must have the following ports open on your Windows VM: * Inbound port: RDP (3389) *or*
-* Inbound port: Custom value (you will then need to specify this custom port when you connect to the VM via Azure Bastion)
+* Inbound port: Custom value (you'll then need to specify this custom port when you connect to the VM via Azure Bastion)
## <a name="connect"></a>Connect to a VM from a Windows local workstation
This section helps you connect to your virtual machine from a Windows local work
> If you want to specify a custom port value, you should also include the field **--resource-port** in the sign-in command. >
- * If you are signing in to an Azure AD login-enabled VM, use the following command. To learn more about how to use Azure AD to sign in to your Azure Linux VMs, see [Azure Linux VMs and Azure AD](../active-directory/devices/howto-vm-sign-in-azure-ad-linux.md).
+ * If you're signing in to an Azure AD login-enabled VM, use the following command. For more information, see [Azure Linux VMs and Azure AD](../active-directory/devices/howto-vm-sign-in-azure-ad-linux.md).
```azurecli-interactive az network bastion ssh --name "<BastionName>" --resource-group "<ResourceGroupName>" --target-resource-id "<VMResourceId>" --auth-type "AAD" ```
- * If you are signing in using an SSH key pair, use the following command.
+ * If you're signing in using an SSH key pair, use the following command.
```azurecli-interactive az network bastion ssh --name "<BastionName>" --resource-group "<ResourceGroupName>" --target-resource-id "<VMResourceId>" --auth-type "ssh-key" --username "<Username>" --ssh-key "<Filepath>" ```
- * If you are signing in using a local username and password, use the following command. You will then be prompted for the password for the target VM.
+ * If you're signing in using a local username and password, use the following command. You'll then be prompted for the password for the target VM.
```azurecli-interactive az network bastion ssh --name "<BastionName>" --resource-group "<ResourceGroupName>" --target-resource-id "<VMResourceId>" --auth-type "password" --username "<Username>" ``` > [!NOTE]
- > VM sessions using the **az network bastion ssh** command do not support file transfer. To use file transfer with SSH over Bastion, please see the section on the **az network bastion tunnel** command further below.
+ > VM sessions using the **az network bastion ssh** command do not support file transfer. To use file transfer with SSH over Bastion, see the [az network bastion tunnel](#connect-tunnel) section.
> ### <a name="connect-windows"></a>Connect to a Windows VM
This section helps you connect to your virtual machine from a Windows local work
> If you want to specify a custom port value, you should also include the field **--resource-port** in the sign-in command. >
- * To connect via RDP, use the following command. You will then be prompted to input your credentials. You can use either a local username and password or your Azure AD credentials. To learn more about how to use Azure AD to sign in to your Azure Windows VMs, see [Azure Windows VMs and Azure AD](../active-directory/devices/howto-vm-sign-in-azure-ad-windows.md).
+ * To connect via RDP, use the following command. You'll then be prompted to input your credentials. You can use either a local username and password or your Azure AD credentials. For more information, see [Azure Windows VMs and Azure AD](../active-directory/devices/howto-vm-sign-in-azure-ad-windows.md).
```azurecli-interactive az network bastion rdp --name "<BastionName>" --resource-group "<ResourceGroupName>" --target-resource-id "<VMResourceId>"
This section helps you connect to your virtual machine from a Windows local work
## <a name="connect-tunnel"></a>Connect to a VM using the *az network bastion tunnel* command This section helps you connect to your virtual machine using the *az network bastion tunnel* command, which allows you to:
-* Use native clients on *non*-Windows local workstations (ex: a Linux PC)
-* Use a native client of your choice
-* Set up concurrent VM sessions with Bastion
-* Access file transfer for SSH sessions
+* Use native clients on *non*-Windows local workstations (ex: a Linux PC).
+* Use a native client of your choice.
+* Set up concurrent VM sessions with Bastion.
+* Access file transfer for SSH sessions.
1. Sign in to your Azure account and select your subscription containing your Bastion resource.
This section helps you connect to your virtual machine using the *az network bas
```azurecli-interactive az network bastion tunnel --name "<BastionName>" --resource-group "<ResourceGroupName>" --target-resource-id "<VMResourceId>" --resource-port "<TargetVMPort>" --port "<LocalMachinePort>" ```
-3. Connect and log in to your target VM using SSH or RDP, the native client of your choice, and the local machine port you specified in Step 2.
+3. Connect and sign in to your target VM using SSH or RDP, the native client of your choice, and the local machine port you specified in Step 2.
## Next steps
bastion Vm Upload Download Native https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/bastion/vm-upload-download-native.md
+
+ Title: 'Upload and download files - native client'
+
+description: Learn how to upload and download files using Azure Bastion and a native client.
+++++ Last updated : 01/31/2022+
+# Customer intent: I want to upload or download files using Bastion.
+++
+# Upload and download files using the native client: Azure Bastion (Preview)
+
+Azure Bastion offers support for file transfer between your target VM and local computer using Bastion and a native RDP or SSH client. To learn more about native client support, refer to [Connect to a VM using the native client](connect-native-client-windows.md). You can use either SSH or RDP to upload files to a VM from your local computer. To download files from a VM, you must use RDP.
+
+> [!NOTE]
+> Uploading and downloading files is supported using the native client only. You can't upload and download files using PowerShell or via the Azure portal.
+>
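As a rough summary (an illustration, not an official API), the file-transfer support described in this article can be captured as a small lookup table:

```python
# Native-client file-transfer support per this article, at the time of writing.
# Keys are (protocol, direction) pairs.
SUPPORTED_TRANSFER = {
    ("RDP", "upload"): True,
    ("RDP", "download"): True,
    ("SSH", "upload"): True,    # via the "az network bastion tunnel" command and scp
    ("SSH", "download"): False, # not currently supported
}

def is_supported(protocol: str, direction: str) -> bool:
    # Unknown combinations are treated as unsupported.
    return SUPPORTED_TRANSFER.get((protocol, direction), False)

print(is_supported("SSH", "download"))  # False
```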
+
+## Upload and download files - RDP
+
+This section helps you transfer files between your local Windows computer and your target VM over RDP. Once connected to the target VM, you can transfer files using right-click, then **Copy** and **Paste**.
+
+1. Sign in to your Azure account and select your subscription containing your Bastion resource.
+
+ ```azurecli-interactive
+ az login
+ az account list
+ az account set --subscription "<subscription ID>"
+ ```
+
+1. Sign in to your target VM via RDP using the following command. You can use either a local username and password, or your Azure AD credentials. To learn more about how to use Azure AD to sign in to your Azure Windows VMs, see [Azure Windows VMs and Azure AD](../active-directory/devices/howto-vm-sign-in-azure-ad-windows.md).
+
+ ```azurecli-interactive
+ az network bastion rdp --name "<BastionName>" --resource-group "<ResourceGroupName>" --target-resource-id "<VMResourceId>"
+ ```
+
+1. Once you sign in to your target VM, the native client on your computer will open up with your VM session. You can now transfer files between your VM and local machine using right-click, then **Copy** and **Paste**.
+
+## Upload files - SSH
+
+This section helps you upload files from your local computer to your target VM over SSH using the *az network bastion tunnel* command. To learn more about the tunnel command, refer to [Connect to a VM using the *az network bastion tunnel* command](connect-native-client-windows.md#connect-tunnel).
+
+> [!NOTE]
+> File download over SSH is not currently supported.
+>
+
+1. Sign in to your Azure account and select your subscription containing your Bastion resource.
+
+ ```azurecli-interactive
+ az login
+ az account list
+ az account set --subscription "<subscription ID>"
+ ```
+
+1. Open the tunnel to your target VM using the following command:
+
+ ```azurecli-interactive
+ az network bastion tunnel --name "<BastionName>" --resource-group "<ResourceGroupName>" --target-resource-id "<VMResourceId>" --resource-port "<TargetVMPort>" --port "<LocalMachinePort>"
+ ```
+
+1. Upload files from your local machine to your target VM using the following command:
+
+ ```azurecli-interactive
+ scp -P <LocalMachinePort> <local machine file path> <username>@127.0.0.1:<target VM file path>
+ ```
+
+1. Connect to your target VM using SSH, the native client of your choice, and the local machine port you specified in Step 2. For example, you can use the following command if you have the OpenSSH client installed on your local computer:
+
+ ```azurecli-interactive
+ ssh <username>@127.0.0.1 -p <LocalMachinePort>
+ ```
+
+## Next steps
+
+- Read the [Bastion FAQ](bastion-faq.md)
cognitive-services Bing Autosuggest Upgrade Guide V5 To V7 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/bing-autosuggest-upgrade-guide-v5-to-v7.md
Last updated 02/20/2019
# Autosuggest API upgrade guide
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
This upgrade guide identifies the changes between version 5 and version 7 of the Bing Autosuggest API. Use this guide to help update your application to use version 7.
Blocked|InvalidRequest.Blocked
## Next steps > [!div class="nextstepaction"]
-> [Use and display requirements](../bing-web-search/use-display-requirements.md)
+> [Use and display requirements](../bing-web-search/use-display-requirements.md)
cognitive-services Get Suggestions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/concepts/get-suggestions.md
# Suggesting query terms
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
+ Typically, you'd call the Bing Autosuggest API each time a user types a new character in your application's search box. The completeness of the query string impacts the relevance of the suggested query terms that the API returns. The more complete the query string, the more relevant the list of suggested query terms is. For example, the suggestions that the API may return for `s` are likely to be less relevant than the queries it returns for `sailing dinghies`.
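As a minimal sketch, assuming the Bing Search Services v7 endpoint and the standard `q`/`mkt` query parameters (verify both against the current Bing Autosuggest API v7 reference), a request URL for a partial query could be built like this:

```python
from urllib.parse import urlencode

# Assumed endpoint for Bing Autosuggest v7 under Bing Search Services;
# resources provisioned under Cognitive Services used a different host.
ENDPOINT = "https://api.bing.microsoft.com/v7.0/Suggestions"

def build_autosuggest_url(partial_query: str, market: str = "en-US") -> str:
    # Send the partial query in "q"; the fuller the query, the better the suggestions.
    return f"{ENDPOINT}?{urlencode({'q': partial_query, 'mkt': market})}"

print(build_autosuggest_url("sailing din"))
```

The actual call would also carry your subscription key in the `Ocp-Apim-Subscription-Key` request header.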
If the user selects a suggested query from the drop-down list, you'd use the que
## Next steps
-* [What is the Bing Autosuggest API?](../get-suggested-search-terms.md)
+* [What is the Bing Autosuggest API?](../get-suggested-search-terms.md)
cognitive-services Sending Requests https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/concepts/sending-requests.md
Last updated 06/27/2019
# Sending requests to the Bing Autosuggest API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
If your application sends queries to any of the Bing Search APIs, you can use the Bing Autosuggest API to improve your users' search experience. The Bing Autosuggest API returns a list of suggested queries based on the partial query string in the search box. As characters are entered into a search box in your application, you can display suggestions in a drop-down list. Use this article to learn more about sending requests to this API.
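If per-keystroke requests generate too much traffic for your application, one common client-side mitigation (an illustration, not something this article prescribes) is to throttle requests so that rapid intermediate keystrokes are coalesced:

```python
# Hypothetical keystroke throttle: fire a request only if enough time has
# passed since the last one; rapid intermediate keystrokes are skipped.
class KeystrokeThrottle:
    def __init__(self, wait_seconds: float = 0.3):
        self.wait = wait_seconds
        self.last = float("-inf")  # so the very first keystroke always fires

    def should_fire(self, now: float) -> bool:
        if now - self.last >= self.wait:
            self.last = now
            return True
        return False

t = KeystrokeThrottle(0.3)
print(t.should_fire(0.0))  # True  - first keystroke fires a request
print(t.should_fire(0.1))  # False - too soon, skip
print(t.should_fire(0.5))  # True  - user paused, fire again
```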
BingAPIs-Market: en-US
- [What is Bing Autosuggest?](../get-suggested-search-terms.md) - [Bing Autosuggest API v7 reference](/rest/api/cognitiveservices-bingsearch/bing-autosuggest-api-v7-reference)-- [Getting suggested search terms from the Bing Autosuggest API](get-suggestions.md)
+- [Getting suggested search terms from the Bing Autosuggest API](get-suggestions.md)
cognitive-services Get Suggested Search Terms https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/get-suggested-search-terms.md
Last updated 12/18/2019
# What is Bing Autosuggest?
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
If your application sends queries to any of the Bing Search APIs, you can use the Bing Autosuggest API to improve your users' search experience. The Bing Autosuggest API returns a list of suggested queries based on the partial query string in the search box. As characters are entered into the search box, you can display suggestions in a drop-down list.
cognitive-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/language-support.md
Last updated 02/20/2019
# Language and region support for the Bing Autosuggest API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The following lists the languages supported by Bing Autosuggest API.
cognitive-services Client Libraries https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/quickstarts/client-libraries.md
# Quickstart: Use the Bing Autosuggest client library
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
::: zone pivot="programming-language-csharp"
cognitive-services Csharp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/quickstarts/csharp.md
# Quickstart: Suggest search queries with the Bing Autosuggest REST API and C#
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Follow this quickstart to learn how to make calls to the Bing Autosuggest API and read the JSON response. This simple C# application sends a partial search query to the API, and returns suggestions for searches. While this application is written in C#, the API is a RESTful Web service compatible with most programming languages. The source code for this sample is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/dotnet/Search/BingAutosuggestv7.cs).
cognitive-services Java https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/quickstarts/java.md
# Quickstart: Suggest search queries with the Bing Autosuggest REST API and Java
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Follow this quickstart to learn how to make calls to the Bing Autosuggest API and read the JSON response. This simple Java application sends a partial search query to the API and displays the suggested searches it returns. Although this application is written in Java, the API is a RESTful web service compatible with most programming languages. The source code for this sample is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/java/Search/BingAutosuggestv7.java).
cognitive-services Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/quickstarts/nodejs.md
# Quickstart: Suggest search queries with the Bing Autosuggest REST API and Node.js
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Follow this quickstart to learn how to make calls to the Bing Autosuggest API and read the JSON response. This simple Node.js application sends a partial search query to the API and displays the suggested searches it returns. Although this application is written in JavaScript, the API is a RESTful web service compatible with most programming languages. The source code for this sample is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/nodejs/Search/BingAutosuggestv7.js).
cognitive-services Php https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/quickstarts/php.md
# Quickstart: Suggest search queries with the Bing Autosuggest REST API and PHP
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Follow this quickstart to learn how to make calls to the Bing Autosuggest API and read the JSON response. This simple PHP application sends a partial search query to the API and displays the suggested searches it returns. Although this application is written in PHP, the API is a RESTful web service compatible with most programming languages.
cognitive-services Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/quickstarts/python.md
# Quickstart: Suggest search queries with the Bing Autosuggest REST API and Python
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Follow this quickstart to learn how to make calls to the Bing Autosuggest API and read the JSON response. This simple Python application sends a partial search query to the API and displays the suggested searches it returns. Although this application is written in Python, the API is a RESTful web service compatible with most programming languages. The source code for this sample is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/python/Search/BingAutosuggestv7.py).
cognitive-services Ruby https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/quickstarts/ruby.md
# Quickstart: Suggest search queries with the Bing Autosuggest REST API and Ruby
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Follow this quickstart to learn how to make calls to the Bing Autosuggest API and read the JSON response. This simple Ruby application sends a partial search query to the API and displays the suggested searches it returns. Although this application is written in Ruby, the API is a RESTful web service compatible with most programming languages.
cognitive-services Autosuggest https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Autosuggest/tutorials/autosuggest.md
# Tutorial: Get search suggestions on a web page
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
In this tutorial, we'll build a Web page that allows users to query the Bing Autosuggest API.
cognitive-services Call Endpoint Csharp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/call-endpoint-csharp.md
# Quickstart: Call your Bing Custom Search endpoint using C#
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to learn how to request search results from your Bing Custom Search instance. Although this application is written in C#, the Bing Custom Search API is a RESTful web service compatible with most programming languages. The source code for this sample is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/dotnet/Search/BingCustomSearchv7.cs).
cognitive-services Call Endpoint Java https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/call-endpoint-java.md
# Quickstart: Call your Bing Custom Search endpoint using Java
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to learn how to request search results from your Bing Custom Search instance. Although this application is written in Java, the Bing Custom Search API is a RESTful web service compatible with most programming languages. The source code for this sample is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/java/Search/BingCustomSearchv7.java).
cognitive-services Call Endpoint Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/call-endpoint-nodejs.md
# Quickstart: Call your Bing Custom Search endpoint using Node.js
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to learn how to request search results from your Bing Custom Search instance. Although this application is written in JavaScript, the Bing Custom Search API is a RESTful web service compatible with most programming languages. The source code for this sample is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/nodejs/Search/BingCustomSearchv7.js).
cognitive-services Call Endpoint Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/call-endpoint-python.md
# Quickstart: Call your Bing Custom Search endpoint using Python
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to learn how to request search results from your Bing Custom Search instance. Although this application is written in Python, the Bing Custom Search API is a RESTful web service compatible with most programming languages. The source code for this sample is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/python/Search/BingCustomSearchv7.py).
cognitive-services Define Custom Suggestions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/define-custom-suggestions.md
# Configure your custom autosuggest experience
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Custom Autosuggest returns a list of suggested search query strings that are relevant to your search experience. The suggested query strings are based on a partial query string that the user provides in the search box. The list will contain a maximum of 10 suggestions.
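The response groups its suggestions, and the combined list holds at most 10 entries. A Python sketch of flattening such a response (the `suggestionGroups`/`searchSuggestions` field names follow the Autosuggest v7 response shape; the sample body is illustrative, not real API output):

```python
import json

# Illustrative response body in the Autosuggest v7 shape
# (suggestionGroups -> searchSuggestions); verify field names
# against the current API reference.
sample_response = json.loads("""
{
  "suggestionGroups": [
    { "searchSuggestions": [
        { "displayText": "sailing", "query": "sailing" },
        { "displayText": "sailing lessons", "query": "sailing lessons" }
    ]}
  ]
}
""")

def extract_suggestions(body: dict, limit: int = 10) -> list:
    """Flatten suggestion groups into display strings, capped at `limit`."""
    suggestions = [
        s["displayText"]
        for group in body.get("suggestionGroups", [])
        for s in group.get("searchSuggestions", [])
    ]
    return suggestions[:limit]

print(extract_suggestions(sample_response))
```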
cognitive-services Define Your Custom View https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/define-your-custom-view.md
# Configure your Bing Custom Search experience
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
A Custom Search instance lets you tailor the search experience to include content only from websites that your users care about. Instead of performing a web-wide search, Bing searches only the slices of the web that interest you. To create your custom view of the web, use the Bing Custom Search [portal](https://www.customsearch.ai).
cognitive-services Endpoint Custom https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/endpoint-custom.md
# Custom Search
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Bing Custom Search enables you to create tailored search experiences for topics that you care about. Your users see search results tailored to the content they care about instead of having to page through search results that have irrelevant content.

## Custom Search Endpoint
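A Custom Search call looks like a regular web search call plus a `customconfig` parameter identifying your instance. A Python sketch of building the URL (endpoint path and parameter name per the Custom Search v7 documentation; the configuration ID is a placeholder you get from the portal):

```python
from urllib.parse import urlencode

# Endpoint and `customconfig` parameter per the Bing Custom Search v7 docs;
# the configuration ID passed in is a placeholder from the portal.
ENDPOINT = "https://api.bing.microsoft.com/v7.0/custom/search"

def custom_search_url(query: str, custom_config_id: str, market: str = "en-US") -> str:
    """Build the GET URL for querying a Custom Search instance."""
    params = {"q": query, "customconfig": custom_config_id, "mkt": market}
    return f"{ENDPOINT}?{urlencode(params)}"

print(custom_search_url("hiking boots", "abc123"))
```

The same `Ocp-Apim-Subscription-Key` header used elsewhere in these quickstarts authenticates the request.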
cognitive-services Get Images From Instance https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/get-images-from-instance.md
Last updated 09/10/2018
# Get images from your custom view
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Bing Custom Images Search lets you enrich your custom search experience with images. Similar to web results, custom search supports searching for images in your instance's list of websites. You can get the images by using the Bing Custom Images Search API or through the Hosted UI feature. The Hosted UI feature is simple to use and is recommended for getting your search experience up and running quickly. For information about configuring your Hosted UI to include images, see [Configure your hosted UI experience](hosted-ui.md).
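The Custom Images and Custom Videos endpoints differ from the web endpoint only in a path segment. A Python sketch of building either URL (paths and the `customconfig` parameter per the Custom Search v7 documentation; the configuration ID is a placeholder):

```python
from urllib.parse import urlencode

# Base path and media-specific segments per the Bing Custom Search v7 docs;
# the configuration ID passed in is a placeholder from the portal.
BASE = "https://api.bing.microsoft.com/v7.0/custom"

def media_search_url(kind: str, query: str, custom_config_id: str) -> str:
    """Build a Custom Images or Custom Videos search URL.

    kind must be 'images' or 'videos'.
    """
    if kind not in ("images", "videos"):
        raise ValueError("kind must be 'images' or 'videos'")
    params = urlencode({"q": query, "customconfig": custom_config_id})
    return f"{BASE}/{kind}/search?{params}"

print(media_search_url("images", "surfboards", "cfg1"))
```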
cognitive-services Get Videos From Instance https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/get-videos-from-instance.md
Last updated 09/10/2018
# Get videos from your custom view
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Bing Custom Videos Search lets you enrich your custom search experience with videos. Similar to web results, custom search supports searching for videos in your instance's list of websites. You can get the videos by using the Bing Custom Videos Search API or through the Hosted UI feature. The Hosted UI feature is simple to use and is recommended for getting your search experience up and running quickly. For information about configuring your Hosted UI to include videos, see [Configure your hosted UI experience](hosted-ui.md).
cognitive-services Hosted Ui https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/hosted-ui.md
# Configure your hosted UI experience
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Bing Custom Search provides a hosted UI that you can easily integrate into your webpages and web applications as a JavaScript code snippet. Using the Bing Custom Search portal, you can configure the layout, color, and search options of the UI.
cognitive-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/language-support.md
# Language and region support for the Bing Custom Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Custom Search API supports more than three dozen countries/regions, many with more than one language.
cognitive-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/overview.md
# What is the Bing Custom Search API?
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Custom Search API enables you to create tailored ad-free search experiences for topics that you care about. You can specify the domains and webpages for Bing to search, as well as pin, boost, or demote specific content to create a custom view of the web and help your users quickly find relevant search results.
cognitive-services Quick Start https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/quick-start.md
# Quickstart: Create your first Bing Custom Search instance
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
To use Bing Custom Search, you need to create a custom search instance that defines your view or slice of the web. This instance contains the public domains, websites, and webpages that you want to search, along with any ranking adjustments you may want.
cognitive-services Client Libraries https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/quickstarts/client-libraries.md
# Quickstart: Use the Bing Custom Search client library
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
::: zone pivot="programming-language-csharp"
cognitive-services Search Your Custom View https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/search-your-custom-view.md
# Call your Bing Custom Search instance from the Portal
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
After you've configured your custom search experience, you can test it from within the Bing Custom Search [portal](https://customsearch.ai).
cognitive-services Share Your Custom Search https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/share-your-custom-search.md
# Share your Custom Search instance
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
You can easily allow collaborative editing and testing of your instance by sharing it with members of your team. You can share your instance with anyone using just their email address. To share an instance:
cognitive-services Custom Search Web Page https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Custom-Search/tutorials/custom-search-web-page.md
# Tutorial: Build a Custom Search web page
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Bing Custom Search enables you to create tailored search experiences for topics that you care about. For example, if you own a martial arts website that provides a search experience, you can specify the domains, sub-sites, and webpages that Bing searches. Your users see search results tailored to the content they care about instead of paging through general search results that may contain irrelevant content.
cognitive-services Search For Entities https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/concepts/search-for-entities.md
# Searching for entities with the Bing Entity Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
## Suggest search terms with the Bing Autosuggest API
cognitive-services Sending Requests https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/concepts/sending-requests.md
# Sending search requests to the Bing Entity Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Entity Search API sends a search query to Bing and gets results that include entities and places. Place results include restaurants, hotels, or other local businesses. For places, the query can specify the name of the local business, or it can ask for a list (for example, "restaurants near me"). Entity results include people, places, or things. A place in this context is a tourist attraction, state, country/region, and so on.
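A minimal Python sketch of building such a request (the `entities` endpoint, `mkt` parameter, and subscription-key header follow the Entity Search v7 documentation; the key is a placeholder):

```python
from urllib.parse import urlencode
from urllib.request import Request

# Endpoint, `mkt` parameter, and header name per the Bing Entity Search v7
# docs; the subscription key is a placeholder, not a real credential.
ENDPOINT = "https://api.bing.microsoft.com/v7.0/entities"
SUBSCRIPTION_KEY = "YOUR-SUBSCRIPTION-KEY"

def entity_search_request(query: str, market: str = "en-US") -> Request:
    """Build (but do not send) an Entity Search GET request."""
    url = f"{ENDPOINT}?{urlencode({'q': query, 'mkt': market})}"
    return Request(url, headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY})

req = entity_search_request("restaurants near me")
print(req.full_url)
```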
cognitive-services Entity Search Endpoint https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/entity-search-endpoint.md
# Bing Entity Search API endpoint
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Entity Search API has one endpoint that returns entities from the Web based on a query. These search results are returned in JSON.
cognitive-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/overview.md
Last updated 12/18/2019
# What is Bing Entity Search API?
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Entity Search API sends a search query to Bing and gets results that include entities and places. Place results include restaurants, hotels, or other local businesses. Bing returns places if the query specifies the name of the local business or asks for a type of business (for example, restaurants near me). Bing returns entities if the query specifies well-known people, places (tourist attractions, states, countries/regions, etc.), or things.
cognitive-services Client Libraries https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/quickstarts/client-libraries.md
# Quickstart: Use the Bing Entity Search client library
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
::: zone pivot="programming-language-csharp"
cognitive-services Csharp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/quickstarts/csharp.md
# Quickstart: Send a search request to the Bing Entity Search REST API using C#
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Entity Search API and view the JSON response. This simple C# application sends a search query to the API, and displays the response. The source code for this application is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/dotnet/Search/BingEntitySearchv7.cs).
cognitive-services Java https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/quickstarts/java.md
# Quickstart: Send a search request to the Bing Entity Search REST API using Java
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Entity Search API and view the JSON response. This simple Java application sends a news search query to the API, and displays the response.
cognitive-services Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/quickstarts/nodejs.md
# Quickstart: Send a search request to the Bing Entity Search REST API using Node.js
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Entity Search API and view the JSON response. This simple JavaScript application sends a news search query to the API, and displays the response. The source code for this sample is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/nodejs/Search/BingEntitySearchv7.js).
cognitive-services Php https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/quickstarts/php.md
# Quickstart: Send a search request to the Bing Entity Search REST API using PHP
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Entity Search API and view the JSON response. This simple PHP application sends a news search query to the API, and displays the response.
cognitive-services Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/quickstarts/python.md
# Quickstart: Send a search request to the Bing Entity Search REST API using Python
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Entity Search API and view the JSON response. This simple Python application sends a news search query to the API, and displays the response. The source code for this sample is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/python/Search/BingEntitySearchv7.py).
cognitive-services Ruby https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/quickstarts/ruby.md
# Quickstart: Send a search request to the Bing Entity Search REST API using Ruby
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Entity Search API and view the JSON response. This simple Ruby application sends a news search query to the API, and displays the response. The source code for this application is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/ruby/Search/BingEntitySearchv7.rb).
cognitive-services Rank Results https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/rank-results.md
# Using ranking to display entity search results
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Each entity search response includes a [RankingResponse](/rest/api/cognitiveservices/bing-web-api-v7-reference#rankingresponse) answer that specifies how you must display search results returned by the Bing Entity Search API. The ranking response groups results into pole, mainline, and sidebar content. The pole result is the most important or prominent result and should be displayed first. If you do not display the remaining results in a traditional mainline and sidebar format, you must give the mainline content higher visibility than the sidebar content.
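Honoring the ranking response amounts to flattening the three groups into display order. The sketch below is illustrative; the `pole`/`mainline`/`sidebar` group names follow the ranking response described above, and the sample body is made up for demonstration.

```python
# Illustrative sketch: flatten a ranking response's pole, mainline, and
# sidebar groups into the order results should be rendered.
DISPLAY_GROUPS = ("pole", "mainline", "sidebar")

def display_order(ranking_response: dict) -> list[dict]:
    """Return ranking items in render order: pole first, then mainline,
    then sidebar."""
    ordered = []
    for group in DISPLAY_GROUPS:
        ordered.extend(ranking_response.get(group, {}).get("items", []))
    return ordered

sample = {
    "pole": {"items": [{"answerType": "Entities", "resultIndex": 0}]},
    "mainline": {"items": [{"answerType": "Places", "resultIndex": 0}]},
}
order = display_order(sample)  # pole item first, then mainline
```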
cognitive-services Tutorial Bing Entities Search Single Page App https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Entities-Search/tutorial-bing-entities-search-single-page-app.md
# Tutorial: Single-page web app
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Entity Search API lets you search the Web for information about *entities* and *places.* You may request either kind of result, or both, in a given query. The definitions of places and entities are provided below.
cognitive-services Bing News Upgrade Guide V5 To V7 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/bing-news-upgrade-guide-v5-to-v7.md
Last updated 01/10/2019
# News Search API upgrade guide
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
This upgrade guide identifies the changes between version 5 and version 7 of the Bing News Search API. Use this guide to help you identify the parts of your application that you need to update to use version 7.
cognitive-services Search For News https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/concepts/search-for-news.md
Last updated 12/18/2019
# Search for news with the Bing News Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing News Search API makes it easy to integrate Bing's cognitive news searching capabilities into your applications.
cognitive-services Send Search Queries https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/concepts/send-search-queries.md
# Sending queries to the Bing News Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing News Search API enables you to search the web for relevant news items. Use this article to learn more about sending search queries to the API.
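A search query is a GET request whose parameters control the search term, market, and paging. The sketch below is illustrative: the endpoint host and parameter names are assumptions based on the v7 API.

```python
from urllib.parse import urlencode

# Illustrative sketch of composing a Bing News Search query URL with
# paging parameters. The host and parameter names are assumptions.
NEWS_ENDPOINT = "https://api.bing.microsoft.com/v7.0/news/search"

def build_news_query(query: str, count: int = 10, offset: int = 0,
                     market: str = "en-US") -> str:
    """Return a GET URL that pages through news results."""
    params = urlencode({"q": query, "count": count, "offset": offset,
                        "mkt": market})
    return f"{NEWS_ENDPOINT}?{params}"

# Page 2 of results: the offset advances by the previous page's count.
page_two = build_news_query("quantum computing", count=10, offset=10)
```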
cognitive-services Csharp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/csharp.md
# Quickstart: Search for news using C# and the Bing News Search REST API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing News Search API. This simple C# application sends a news search query to the API, and displays the JSON response.
cognitive-services Endpoint News https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/endpoint-news.md
# Bing News Search API endpoints
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The **News Search API** returns news articles, Web pages, images, videos, and [entities](../bing-entities-search/overview.md). Entities contain summary information about a person, place, or topic.
cognitive-services Go https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/go.md
# Quickstart: Get news results using the Bing News Search REST API and Go
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
This quickstart uses the Go language to call the Bing News Search API. The results include names and URLs of news sources identified by the query string.
cognitive-services Java https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/java.md
# Quickstart: Perform a news search using Java and the Bing News Search REST API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing News Search API. This simple Java application sends a news search query to the API, and displays the JSON response.
cognitive-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/language-support.md
# Language and region support for the Bing News Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing News Search API supports numerous countries/regions, many with more than one language. Specifying a country/region with a query serves primarily to refine search results based on interests in that country/region. Additionally, the results may contain links to Bing, and these links may localize the Bing user experience according to the specified country/region or language.
cognitive-services Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/nodejs.md
# Quickstart: Perform a news search using Node.js and the Bing News Search REST API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing News Search API. This simple JavaScript application sends a search query to the API and displays the JSON response.
cognitive-services Php https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/php.md
# Quickstart: Perform a news search using PHP and the Bing News Search REST API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing News Search API. This simple PHP application sends a search query to the API and displays the JSON response.
cognitive-services Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/python.md
# Quickstart: Perform a news search using Python and the Bing News Search REST API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing News Search API. This simple Python application sends a search query to the API and processes the JSON result.
cognitive-services Client Libraries https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/quickstarts/client-libraries.md
# Quickstart: Use the Bing News Search client library
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
::: zone pivot="programming-language-csharp"
cognitive-services Ruby https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/ruby.md
# Quickstart: Perform a news search using Ruby and the Bing News Search REST API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing News Search API. This simple Ruby application sends a search query to the API and processes the JSON response.
cognitive-services Search The Web https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/search-the-web.md
# What is the Bing News Search API?
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing News Search API makes it easy to integrate Bing's cognitive news searching capabilities into your applications. The API provides a similar experience to [Bing News](https://www.bing.com/news), letting you send search queries and receive relevant news articles.
cognitive-services Tutorial Bing News Search Single Page App https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-News-Search/tutorial-bing-news-search-single-page-app.md
# Tutorial: Create a single-page web app
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing News Search API lets you search the web and obtain news results relevant to a search query. In this tutorial, you build a single-page web application that uses the Bing News Search API to display search results on the page. The application includes HTML, CSS, and JavaScript components. The source code for this sample is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/Tutorials/BingNewsSearchApp.html).
cognitive-services Bing Spell Check Upgrade Guide V5 To V7 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/bing-spell-check-upgrade-guide-v5-to-v7.md
Last updated 02/20/2019
# Spell Check API upgrade guide
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
This upgrade guide identifies the changes between version 5 and version 7 of the Bing Spell Check API. Use this guide to help you identify the parts of your application that you need to update to use version 7.
cognitive-services Sending Requests https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/concepts/sending-requests.md
# Sending requests to the Bing Spell Check API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
To check a text string for spelling and grammar errors, send a GET request to the Bing Spell Check endpoint.
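Such a request can be sketched as follows. This is illustrative only: the host, the `mode` values, and the header name are assumptions based on the v7 API, so verify them against your resource's endpoint.

```python
from urllib.parse import urlencode

# Illustrative sketch of a Bing Spell Check v7 GET request URL. The host
# and parameter names are assumptions drawn from the v7 API.
SPELLCHECK_ENDPOINT = "https://api.bing.microsoft.com/v7.0/spellcheck"

def build_spellcheck_url(text: str, mode: str = "proof",
                         market: str = "en-US") -> str:
    """Return the GET URL that checks `text` for spelling and grammar."""
    params = urlencode({"text": text, "mode": mode, "mkt": market})
    return f"{SPELLCHECK_ENDPOINT}?{params}"

url = build_spellcheck_url("Hollo, wrld!")
# Send with the Ocp-Apim-Subscription-Key header set to your key.
```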
cognitive-services Using Spell Check https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/concepts/using-spell-check.md
# Using the Bing Spell Check API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this article to learn about using the Bing Spell Check API to perform contextual grammar and spell checking. While most spell-checkers rely on dictionary-based rule sets, the Bing spell-checker leverages machine learning and statistical machine translation to provide accurate and contextual corrections.
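Processing the corrections means walking the response's flagged tokens and picking suggestions. The sketch below is illustrative; the `flaggedTokens`/`suggestions` field names reflect the v7 response shape, and the sample body is made up for demonstration.

```python
# Illustrative sketch of reading a Spell Check response: map each flagged
# token to its highest-scoring suggested correction. Field names are
# assumptions based on the v7 response shape.
def best_corrections(response_json: dict) -> dict[str, str]:
    """Map each flagged token to its highest-scoring suggestion."""
    corrections = {}
    for flagged in response_json.get("flaggedTokens", []):
        suggestions = flagged.get("suggestions", [])
        if suggestions:
            best = max(suggestions, key=lambda s: s.get("score", 0.0))
            corrections[flagged["token"]] = best["suggestion"]
    return corrections

sample = {
    "flaggedTokens": [
        {"offset": 0, "token": "Hollo", "type": "UnknownToken",
         "suggestions": [{"suggestion": "Hello", "score": 0.9},
                         {"suggestion": "Hollow", "score": 0.3}]},
    ]
}
fixes = best_corrections(sample)  # {"Hollo": "Hello"}
```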
cognitive-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/language-support.md
# Language and region support for Bing Spell Check API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
These languages are supported by the Bing Spell Check API (only in `spell` mode).
cognitive-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/overview.md
# What is the Bing Spell Check API?
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Spell Check API enables you to perform contextual grammar and spell checking on text. While most spell-checkers rely on dictionary-based rule sets, the Bing spell-checker leverages machine learning and statistical machine translation to provide accurate and contextual corrections.
cognitive-services Csharp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/quickstarts/csharp.md
# Quickstart: Check spelling with the Bing Spell Check REST API and C#
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Spell Check REST API. This simple C# application sends a request to the API and returns a list of suggested corrections.
cognitive-services Java https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/quickstarts/java.md
# Quickstart: Check spelling with the Bing Spell Check REST API and Java
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Spell Check REST API. This simple Java application sends a request to the API and returns a list of suggested corrections.
cognitive-services Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/quickstarts/nodejs.md
# Quickstart: Check spelling with the Bing Spell Check REST API and Node.js
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Spell Check REST API. This simple JavaScript application sends a request to the API and returns a list of suggested corrections.
cognitive-services Php https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/quickstarts/php.md
# Quickstart: Check spelling with the Bing Spell Check REST API and PHP
Use this quickstart to make your first call to the Bing Spell Check REST API. This simple PHP application sends a request to the API and returns a list of suggested corrections.
cognitive-services Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/quickstarts/python.md
# Quickstart: Check spelling with the Bing Spell Check REST API and Python
Use this quickstart to make your first call to the Bing Spell Check REST API. This simple Python application sends a request to the API and returns a list of suggested corrections.
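The shape of such a request can be sketched in a few lines of Python. This is a minimal, hedged sketch, not the quickstart's full sample: the endpoint shown is the Bing Search Services form of the v7 Spell Check endpoint (the Cognitive Services form differs), and the subscription key is a placeholder.

```python
import urllib.parse

# Bing Spell Check v7 endpoint (Bing Search Services form); key is a placeholder.
ENDPOINT = "https://api.bing.microsoft.com/v7.0/spellcheck"

def build_spellcheck_request(text, key, mode="proof", mkt="en-US"):
    """Return the URL and headers for a Spell Check request (no call is made)."""
    params = urllib.parse.urlencode({"text": text, "mode": mode, "mkt": mkt})
    headers = {"Ocp-Apim-Subscription-Key": key}
    return f"{ENDPOINT}?{params}", headers

url, headers = build_spellcheck_request("Hollo, wrld!", "YOUR_SUBSCRIPTION_KEY")
# Sending a GET to `url` with `headers` returns JSON whose flaggedTokens array
# lists the misspelled tokens and their suggested corrections.
```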
cognitive-services Ruby https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/quickstarts/ruby.md
# Quickstart: Check spelling with the Bing Spell Check REST API and Ruby
Use this quickstart to make your first call to the Bing Spell Check REST API using Ruby. This simple application sends a request to the API and returns a list of suggested corrections.
cognitive-services Sdk Quickstart Spell Check https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/sdk-quickstart-spell-check.md
# Quickstart: Check spelling with the Bing Spell Check SDK for C#
Use this quickstart to begin spell checking with the Bing Spell Check SDK for C#. While Bing Spell Check has a REST API compatible with most programming languages, the SDK provides an easy way to integrate the service into your applications. The source code for this sample can be found on [GitHub](https://github.com/Azure-Samples/cognitive-services-dotnet-sdk-samples/tree/master/samples/SpellCheck).
cognitive-services Spellcheck https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Spell-Check/tutorials/spellcheck.md
# Tutorial: Build a Web page Spell Check client
In this tutorial, we'll build a Web page that allows users to query the Bing Spell Check API. The source code for this application is available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/Tutorials/BingSpellCheckApp.html).
cognitive-services Bing Video Upgrade Guide V5 To V7 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/bing-video-upgrade-guide-v5-to-v7.md
Last updated 01/31/2019
# Video Search API upgrade guide
This upgrade guide identifies the changes between version 5 and version 7 of the Bing Video Search API. Use this guide to help you identify the parts of your application that you need to update to use version 7.
cognitive-services Get Videos https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/concepts/get-videos.md
# Search for videos with the Bing Video Search API
The Bing Video Search API makes it easy to integrate Bing's cognitive video search capabilities into your applications. While the API primarily finds and returns relevant videos from the web, it also provides several features for intelligent, focused video retrieval.
cognitive-services Sending Requests https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/concepts/sending-requests.md
# Sending search requests to the Bing Video Search API
This article describes the parameters and attributes of requests sent to the Bing Video Search API, as well as the JSON response object it returns.
cognitive-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/overview.md
Last updated 12/18/2019
# What is the Bing Video Search API?
The Bing Video Search API makes it easy to add video searching capabilities to your services and applications. By sending user search queries with the API, you can get and display relevant and high-quality videos similar to [Bing Video](https://www.bing.com/video). Use this API for search results that only contain videos. The [Bing Web Search API](../bing-web-search/overview.md) can return other types of web content, including webpages, videos, news and images.
cognitive-services Client Libraries https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/quickstarts/client-libraries.md
# Quickstart: Use the Bing Video Search client library
::: zone pivot="programming-language-csharp"
cognitive-services Csharp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/quickstarts/csharp.md
# Quickstart: Search for videos using the Bing Video Search REST API and C#
Use this quickstart to make your first call to the Bing Video Search API. This simple C# application sends an HTTP video search query to the API and displays the JSON response. Although this application is written in C#, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Java https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/quickstarts/java.md
# Quickstart: Search for videos using the Bing Video Search REST API and Java
Use this quickstart to make your first call to the Bing Video Search API. This simple Java application sends an HTTP video search query to the API and displays the JSON response. Although this application is written in Java, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/quickstarts/nodejs.md
# Quickstart: Search for videos using the Bing Video Search REST API and Node.js
Use this quickstart to make your first call to the Bing Video Search API. This simple JavaScript application sends an HTTP video search query to the API, and displays the JSON response. Although this application is written in JavaScript and uses Node.js, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Php https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/quickstarts/php.md
# Quickstart: Search for videos using the Bing Video Search REST API and PHP
Use this quickstart to make your first call to the Bing Video Search API. This simple PHP application sends an HTTP video search query to the API, and displays the JSON response. The example code is written to work under PHP 5.6.
cognitive-services Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/quickstarts/python.md
# Quickstart: Search for videos using the Bing Video Search REST API and Python
Use this quickstart to make your first call to the Bing Video Search API. This simple Python application sends an HTTP video search query to the API, and displays the JSON response. Although this application is written in Python, the API is a RESTful Web service compatible with most programming languages.
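As a minimal sketch of what that HTTP query looks like (Bing Search Services form of the v7 endpoint; the key and the query term are placeholders):

```python
import urllib.parse

# Bing Video Search v7 endpoint (Bing Search Services form); key is a placeholder.
ENDPOINT = "https://api.bing.microsoft.com/v7.0/videos/search"

def build_video_search(query, key, count=10, mkt="en-US"):
    """Return the URL and headers for a video search request (no call is made)."""
    params = urllib.parse.urlencode({"q": query, "count": count, "mkt": mkt})
    return f"{ENDPOINT}?{params}", {"Ocp-Apim-Subscription-Key": key}

url, headers = build_video_search("kittens", "YOUR_SUBSCRIPTION_KEY")
# The JSON response's "value" array holds Video objects with fields such as
# name, contentUrl, and thumbnailUrl.
```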
cognitive-services Ruby https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/quickstarts/ruby.md
# Quickstart: Search for videos using the Bing Video Search REST API and Ruby
Use this quickstart to make your first call to the Bing Video Search API. This simple Ruby application sends an HTTP video search query to the API, and displays the JSON response. Although this application is written in Ruby, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Trending Videos https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/trending-videos.md
Last updated 01/31/2019
# Get trending videos with the Bing Video Search API
The Bing Video Search API enables you to find today's trending videos from across the web, and in different categories.
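Trending videos come from a dedicated endpoint that takes no query term; the market parameter narrows results. A hedged sketch (Python, Bing Search Services form of the v7 endpoint, placeholder key):

```python
import urllib.parse

# Trending-videos endpoint (Bing Search Services v7 form); key is a placeholder.
ENDPOINT = "https://api.bing.microsoft.com/v7.0/videos/trending"

def build_trending_request(key, mkt="en-US"):
    """Return the URL and headers for a trending-videos request (no call is made)."""
    params = urllib.parse.urlencode({"mkt": mkt})
    return f"{ENDPOINT}?{params}", {"Ocp-Apim-Subscription-Key": key}

url, headers = build_trending_request("YOUR_SUBSCRIPTION_KEY")
# The JSON response groups results into bannerTiles and categories.
```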
cognitive-services Tutorial Bing Video Search Single Page App https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/tutorial-bing-video-search-single-page-app.md
# Tutorial: Single-page Video Search app
The Bing Video Search API lets you search the web and get video results relevant to a search query. In this tutorial, we'll build a single-page web application that uses the Bing Video Search API to display search results on the page. The application includes HTML, CSS, and JavaScript components.
cognitive-services Video Insights https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Video-Search/video-insights.md
Last updated 01/31/2019
# Get insights about a video
Each video returned by the Bing Video Search API includes a video ID that you can use to get more information about it, such as related videos. To get insights about a video, get its [videoId](/rest/api/cognitiveservices-bingsearch/bing-video-api-v7-reference#video-videoid) token in the API response.
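The insights request passes that ID to the video details endpoint along with a modules parameter naming the insight to fetch. A hedged sketch (Python; endpoint shown in its Bing Search Services v7 form, key and video ID are placeholders):

```python
import urllib.parse

# Video details endpoint (Bing Search Services v7 form); key is a placeholder.
ENDPOINT = "https://api.bing.microsoft.com/v7.0/videos/details"

def build_insights_request(video_id, key, modules="RelatedVideos"):
    """Return the URL and headers for a video-insights request (no call is made)."""
    params = urllib.parse.urlencode({"id": video_id, "modules": modules})
    return f"{ENDPOINT}?{params}", {"Ocp-Apim-Subscription-Key": key}

# "abc123" stands in for a real videoId token from a prior search response.
url, headers = build_insights_request("abc123", "YOUR_SUBSCRIPTION_KEY")
```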
cognitive-services Autosuggest Bing Search Terms https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/autosuggest-bing-search-terms.md
# Autosuggest Bing search terms in your application
If you provide a search box where the user enters their search term, use the [Bing Autosuggest API](../bing-autosuggest/get-suggested-search-terms.md) to improve the experience. The API returns suggested query strings based on partial search terms as the user types.
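Each keystroke can be turned into a suggestions request like the following sketch (Python; Bing Search Services form of the v7 Autosuggest endpoint, placeholder key):

```python
import urllib.parse

# Bing Autosuggest v7 endpoint (Bing Search Services form); key is a placeholder.
ENDPOINT = "https://api.bing.microsoft.com/v7.0/suggestions"

def build_suggest_request(partial_query, key, mkt="en-US"):
    """Return the URL and headers for an Autosuggest request (no call is made)."""
    params = urllib.parse.urlencode({"q": partial_query, "mkt": mkt})
    return f"{ENDPOINT}?{params}", {"Ocp-Apim-Subscription-Key": key}

# As the user types "sail", request suggestions for the partial term.
url, headers = build_suggest_request("sail", "YOUR_SUBSCRIPTION_KEY")
# The JSON response's suggestionGroups hold the suggested query strings.
```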
cognitive-services Bing Api Comparison https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/bing-api-comparison.md
# What are the Bing Search APIs?
The Bing Search APIs let you build web-connected apps and services that find webpages, images, news, locations, and more without advertisements. By sending search requests using the Bing Search REST APIs or SDKs, you can get relevant information and content for web searches. Use this article to learn about the different Bing search APIs and how you can integrate cognitive searches into your applications and services. Pricing and rate limits may vary between APIs.
cognitive-services Bing Web Stats https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/bing-web-stats.md
# Add analytics to the Bing Search APIs
Bing Statistics provides analytics for the Bing Search APIs. These analytics include call volume, top query strings, geographic distribution, and more. You can enable Bing Statistics in the [Azure portal](https://ms.portal.azure.com) by navigating to your Azure resource and clicking **Enable Bing Statistics**.
cognitive-services Bing Web Upgrade Guide V5 To V7 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/bing-web-upgrade-guide-v5-to-v7.md
Last updated 02/12/2019
# Upgrade from Bing Web Search API v5 to v7
This upgrade guide identifies the changes between version 5 and version 7 of the Bing Web Search API. Use this guide to help you identify the parts of your application that you need to update to use version 7.
cognitive-services Csharp Ranking Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/csharp-ranking-tutorial.md
# Build a console app search client in C#
This tutorial shows how to build a simple .NET Core console app that allows users to query the Bing Web Search API and display ranked results.
cognitive-services Filter Answers https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/filter-answers.md
Last updated 07/08/2019
# Filtering the answers that the search response includes
When you query the web, Bing returns all the relevant content it finds for the search. For example, if the search query is "sailing+dinghies", the response might contain the following answers:
cognitive-services Hit Highlighting https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/hit-highlighting.md
Last updated 07/30/2019
# Using decoration markers to highlight text
Bing supports hit highlighting, which marks query terms (or other terms that Bing finds relevant) in the display strings of some answers. For example, a webpage result's `name`, `displayUrl`, and `snippet` fields might contain marked query terms.
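When hit highlighting is enabled with textDecorations=true and textFormat=Raw, the convention is that hit terms are delimited by the Unicode characters U+E000 (begin) and U+E001 (end). A small sketch of rendering those markers as HTML bold tags (marker values assumed per that convention; adjust if your textFormat differs):

```python
# U+E000 marks the start of a hit term, U+E001 marks the end.
HIT_BEGIN, HIT_END = "\ue000", "\ue001"

def highlight(display_string: str) -> str:
    """Convert Bing hit-highlighting markers into HTML bold tags."""
    return display_string.replace(HIT_BEGIN, "<b>").replace(HIT_END, "</b>")

# A snippet as it might arrive with raw decoration markers:
snippet = f"Learn to sail {HIT_BEGIN}dinghies{HIT_END} this summer."
html = highlight(snippet)
```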
cognitive-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/language-support.md
# Language and region support for the Bing Web Search API
The Bing Web Search API supports more than three dozen countries or regions, many with more than one language. Specifying a country or region with a query helps refine search results based on that country's or region's interests. The results may include links to Bing, and these links may localize the Bing user experience according to the specified country/region or language.
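The market is passed as the mkt query parameter (a language-country code), and setLang can control the language of user-interface strings in the response. A sketch with illustrative values (German market; the query term is a placeholder):

```python
import urllib.parse

# mkt selects the market (language-country code); setLang selects the language
# of user-interface strings. "segeln" is an illustrative query term.
params = urllib.parse.urlencode({"q": "segeln", "mkt": "de-DE", "setLang": "de"})
url = "https://api.bing.microsoft.com/v7.0/search?" + params
```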
cognitive-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/overview.md
# What is the Bing Web Search API?
The Bing Web Search API is a RESTful service that provides instant answers to user queries. Search results are easily configured to include web pages, images, videos, news, translations, and more. Bing Web Search provides the results as JSON based on search relevance and your Bing Web Search subscriptions.
cognitive-services Paging Search Results https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/paging-search-results.md
# How to page through results from the Bing Search APIs
When you send a call to the Bing Web, Custom, Image, News, or Video Search APIs, Bing returns a subset of the total number of results that may be relevant to the query. To get the estimated total number of available results, access the answer object's `totalEstimatedMatches` field.
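Paging is driven by the count and offset query parameters: request `count` results starting at `offset`, then advance the offset by the page size until you reach `totalEstimatedMatches` or your own cap. The offset sequence can be sketched without any network calls (the cap value here is illustrative):

```python
def page_offsets(total_estimated_matches, count=10, max_results=50):
    """Return the offset values needed to page through results,
    requesting `count` results per call, up to `max_results` total."""
    limit = min(total_estimated_matches, max_results)
    return list(range(0, limit, count))

# e.g. five pages of ten results each for a large result set,
# but a single page when only seven matches exist.
offsets_large = page_offsets(1000, count=10, max_results=50)
offsets_small = page_offsets(7, count=10, max_results=50)
```

Note that `totalEstimatedMatches` is an estimate, so real paging code should also stop when a response returns fewer results than requested.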
cognitive-services Client Libraries https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/quickstarts/client-libraries.md
# Quickstart: Use a Bing Web Search client library
::: zone pivot="programming-language-csharp"
cognitive-services Csharp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/quickstarts/csharp.md
# Quickstart: Search the web using the Bing Web Search REST API and C#
Use this quickstart to make your first call to the Bing Web Search API. This C# application sends a search request to the API, and shows the JSON response. Although this application is written in C#, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Go https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/quickstarts/go.md
# Quickstart: Search the web using the Bing Web Search REST API and Go
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Web Search API. This Go application sends a search request to the API, and shows the JSON response. Although this application is written in Go, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Java https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/quickstarts/java.md
# Quickstart: Use Java to search the web with the Bing Web Search REST API, an Azure cognitive service
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
In this quickstart, you'll use a Java application to make your first call to the Bing Web Search API. This Java application sends a search request to the API, and shows the JSON response. Although this application is written in Java, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/quickstarts/nodejs.md
# Quickstart: Search the web using the Bing Web Search REST API and Node.js
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Web Search API. This Node.js application sends a search request to the API, and shows the JSON response. Although this application is written in JavaScript, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Php https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/quickstarts/php.md
# Quickstart: Use PHP to call the Bing Web Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Web Search API. This PHP application sends a search request to the API, and shows the JSON response. Although this application is written in PHP, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/quickstarts/python.md
# Quickstart: Use Python to call the Bing Web Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Web Search API. This Python application sends a search request to the API, and shows the JSON response. Although this application is written in Python, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Ruby https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/quickstarts/ruby.md
# Quickstart: Use Ruby to call the Bing Web Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Use this quickstart to make your first call to the Bing Web Search API. This Ruby application sends a search request to the API, and shows the JSON response. Although this application is written in Ruby, the API is a RESTful Web service compatible with most programming languages.
Use this code to make a request and handle the response:
```ruby
# Construct the endpoint uri.
uri = URI(uri + path + "?q=" + URI.escape(term))
puts "Searching the Web for: " + term

# Create the request.
request = Net::HTTP::Get.new(uri)
request['Ocp-Apim-Subscription-Key'] = accessKey

# Get the response.
response = Net::HTTP.start(uri.host, uri.port, :use_ssl => uri.scheme == 'https') do |http|
    http.request(request)
end
```
cognitive-services Rank Results https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/rank-results.md
Last updated 03/17/2019
# How to use ranking to display Bing Web Search API results
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Each search response includes a [RankingResponse](/rest/api/cognitiveservices-bingsearch/bing-web-api-v7-reference#rankingresponse) answer that specifies how you must display the search results. The ranking response groups results into mainline content and sidebar content for a traditional search results page. If you do not display the results in a traditional mainline-and-sidebar format, you must give the mainline content higher visibility than the sidebar content.
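As a sketch of walking the ranking answer (using a trimmed, hand-written response; the field names `rankingResponse`, `mainline`, `answerType`, and `resultIndex` are from the RankingResponse reference, and the sample values are illustrative), each ranking item points back into one of the named answer arrays:

```python
import json

# Trimmed, illustrative response: the ranking answer gives display order.
response = json.loads("""
{
  "webPages": {"value": [{"name": "Web result 0"}, {"name": "Web result 1"}]},
  "news": {"value": [{"name": "News result 0"}]},
  "rankingResponse": {
    "mainline": {"items": [
      {"answerType": "WebPages", "resultIndex": 0},
      {"answerType": "News", "resultIndex": 0},
      {"answerType": "WebPages", "resultIndex": 1}
    ]}
  }
}
""")

# Map ranking answerType values to the top-level answer field names.
ANSWER_FIELDS = {"WebPages": "webPages", "News": "news"}

def mainline_order(resp):
    """Return result names in the order the mainline ranking dictates."""
    ordered = []
    for item in resp["rankingResponse"]["mainline"]["items"]:
        answer = resp[ANSWER_FIELDS[item["answerType"]]]
        ordered.append(answer["value"][item["resultIndex"]]["name"])
    return ordered

print(mainline_order(response))  # ['Web result 0', 'News result 0', 'Web result 1']
```

Note that a ranking item without a `resultIndex` indicates the whole answer should be displayed at that position; the sketch above only handles indexed items.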
cognitive-services Resize And Crop Thumbnails https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/resize-and-crop-thumbnails.md
# Resize and crop thumbnail images
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
Some answers from the Bing Search APIs include URLs to thumbnail images served by Bing, which you can resize and crop, and may contain query parameters. For example:
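A minimal sketch of overriding a thumbnail's size via query parameters, assuming the `w` (width) and `h` (height) parameters described by the article; the thumbnail URL below is a made-up placeholder:

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def resize_thumbnail(url, width, height):
    """Return the thumbnail URL with w/h query parameters set or overridden."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query.update({"w": str(width), "h": str(height)})
    return urlunparse(parts._replace(query=urlencode(query)))

# Hypothetical thumbnail URL for illustration.
thumb = "https://tse1.mm.bing.net/th?id=OIP.example&pid=Api"
print(resize_thumbnail(thumb, 200, 200))
```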
cognitive-services Sdk Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/sdk-samples.md
# Bing Web Search SDK samples
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Web Search SDK is available in Python, Node.js, C#, and Java. Code samples, prerequisites, and build instructions are provided on GitHub. The following scenarios are covered:
cognitive-services Search Responses https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/search-responses.md
# Bing Web Search API response structure and answer types
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
When you send Bing Web Search a search request, it returns a [`SearchResponse`](/rest/api/cognitiveservices-bingsearch/bing-web-api-v7-reference#searchresponse) object in the response body. The object includes a field for each answer that Bing determined was relevant to the query. This example illustrates a response object if Bing returned all answers:
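Because only the relevant answers appear as top-level fields, code that consumes the response should probe for each answer before reading it. A small sketch, using a trimmed, hand-written response body:

```python
import json

# Illustrative SearchResponse with only two answers present.
response = json.loads("""
{
  "_type": "SearchResponse",
  "webPages": {"value": []},
  "images": {"value": []},
  "rankingResponse": {}
}
""")

# Probe for each answer type rather than assuming it exists.
POSSIBLE_ANSWERS = ["webPages", "images", "videos", "news", "entities"]
present = [name for name in POSSIBLE_ANSWERS if name in response]
print(present)  # ['webPages', 'images']
```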
cognitive-services Throttling Requests https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/throttling-requests.md
# Throttling requests to the Bing Web Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
[!INCLUDE [cognitive-services-bing-throttling-requests](../../../includes/cognitive-services-bing-throttling-requests.md)]
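When the service throttles a request, it returns HTTP 429 with a `Retry-After` header indicating how long to wait before resending. A minimal retry sketch (the transport is simulated here so the logic is self-contained; a real client would read the status code and header from the HTTP response):

```python
import time

def send_with_retry(call, max_attempts=3):
    """Retry a callable returning (status, retry_after_seconds, body) on 429."""
    for attempt in range(max_attempts):
        status, retry_after, body = call()
        if status != 429:
            return body
        time.sleep(retry_after)  # honor Retry-After before resending
    raise RuntimeError("request kept being throttled")

# Simulated transport: throttled once (zero wait for the demo), then succeeds.
responses = iter([(429, 0, None), (200, 0, {"ok": True})])
print(send_with_retry(lambda: next(responses)))  # {'ok': True}
```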
cognitive-services Tutorial Bing Web Search Single Page App https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/tutorial-bing-web-search-single-page-app.md
# Tutorial: Create a single-page app using the Bing Web Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
This single-page app demonstrates how to retrieve, parse, and display search results from the Bing Web Search API. The tutorial uses boilerplate HTML and CSS, and focuses on the JavaScript code. HTML, CSS, and JS files are available on [GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/tree/master/Tutorials/Bing-Web-Search) with quickstart instructions.
cognitive-services Use Display Requirements https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/use-display-requirements.md
# Bing Search API use and display requirements
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
These use and display requirements apply to any implementation of the content and associated information from the following Bing Search APIs, including relationships, metadata, and other signals.
cognitive-services Web Search Endpoints https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Bing-Web-Search/web-search-endpoints.md
# Web Search endpoint
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The **Web Search API** returns Web pages, news, images, videos, and [entities](../bing-entities-search/overview.md). Entities have summary information about a person, place, or topic.
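A request to the endpoint is an HTTP GET with the query in the `q` parameter. A minimal sketch of building the request URL (the endpoint shown is the Bing Search Services form; substitute the endpoint for your own resource, and `mkt`/`count` values are illustrative):

```python
from urllib.parse import urlencode

# Assumed endpoint; use the one from your own resource.
endpoint = "https://api.bing.microsoft.com/v7.0/search"
params = {"q": "microsoft cognitive services", "mkt": "en-US", "count": 10}
url = endpoint + "?" + urlencode(params)
print(url)
```

A real request would also send the `Ocp-Apim-Subscription-Key` header, as shown in the quickstarts above.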
cognitive-services Luis Reference Regions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/LUIS/luis-reference-regions.md
The authoring region app can only be published to a corresponding publish region
| Asia | `westus`<br>[www.luis.ai][www.luis.ai]| East Asia<br>`eastasia` | `https://eastasia.api.cognitive.microsoft.com/luis/v2.0/apps/YOUR-APP-ID?subscription-key=YOUR-SUBSCRIPTION-KEY` |
| Asia | `westus`<br>[www.luis.ai][www.luis.ai]| Japan East<br>`japaneast` | `https://japaneast.api.cognitive.microsoft.com/luis/v2.0/apps/YOUR-APP-ID?subscription-key=YOUR-SUBSCRIPTION-KEY` |
| Asia | `westus`<br>[www.luis.ai][www.luis.ai]| Japan West<br>`japanwest` | `https://japanwest.api.cognitive.microsoft.com/luis/v2.0/apps/YOUR-APP-ID?subscription-key=YOUR-SUBSCRIPTION-KEY` |
+| Asia | `westus`<br>[www.luis.ai][www.luis.ai]| Jio India West<br>`jioindiawest` | `https://jioindiawest.api.cognitive.microsoft.com/luis/v2.0/apps/YOUR-APP-ID?subscription-key=YOUR-SUBSCRIPTION-KEY` |
| Asia | `westus`<br>[www.luis.ai][www.luis.ai]| Korea Central<br>`koreacentral` | `https://koreacentral.api.cognitive.microsoft.com/luis/v2.0/apps/YOUR-APP-ID?subscription-key=YOUR-SUBSCRIPTION-KEY` |
| Asia | `westus`<br>[www.luis.ai][www.luis.ai]| Southeast Asia<br>`southeastasia` | `https://southeastasia.api.cognitive.microsoft.com/luis/v2.0/apps/YOUR-APP-ID?subscription-key=YOUR-SUBSCRIPTION-KEY` |
| Asia | `westus`<br>[www.luis.ai][www.luis.ai]| North UAE<br>`northuae` | `https://northuae.api.cognitive.microsoft.com/luis/v2.0/apps/YOUR-APP-ID?subscription-key=YOUR-SUBSCRIPTION-KEY` |
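The endpoint URLs in the table all follow one pattern, so a small helper can build them from the publish region; a sketch (the placeholder app ID and key mirror the table's placeholders):

```python
def luis_endpoint(region, app_id, subscription_key):
    """Build a LUIS v2.0 prediction endpoint URL from the publish region."""
    return (f"https://{region}.api.cognitive.microsoft.com"
            f"/luis/v2.0/apps/{app_id}?subscription-key={subscription_key}")

print(luis_endpoint("koreacentral", "YOUR-APP-ID", "YOUR-SUBSCRIPTION-KEY"))
```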
cognitive-services Add Sharepoint Datasources https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/QnAMaker/How-To/add-sharepoint-datasources.md
If the QnA Maker knowledge base manager is not the Active Directory manager, you
## Add supported file types to knowledge base
-You can add all QnA Maker-supported [file types](https://docs.microsoft.com/azure/cognitive-services/qnamaker/concepts/data-sources-and-content#file-and-url-data-types) from a SharePoint site to your knowledge base. You may have to grant [permissions](#permissions) if the file resource is secured.
+You can add all QnA Maker-supported [file types](/azure/cognitive-services/qnamaker/concepts/data-sources-and-content#file-and-url-data-types) from a SharePoint site to your knowledge base. You may have to grant [permissions](#permissions) if the file resource is secured.
1. From the library with the SharePoint site, select the file's ellipsis menu, `...`. 1. Copy the file's URL.
Use the **@microsoft.graph.downloadUrl** from the previous section as the `fileu
## Next steps > [!div class="nextstepaction"]
-> [Collaborate on your knowledge base](https://docs.microsoft.com/azure/cognitive-services/qnamaker/concepts/data-sources-and-content#file-and-url-data-types.yml)
+> [Collaborate on your knowledge base](/azure/cognitive-services/qnamaker/concepts/data-sources-and-content#file-and-url-data-types.yml)
cognitive-services Audio Processing Speech Sdk https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/audio-processing-speech-sdk.md
Title: Using the Microsoft Audio Stack (MAS) - Speech service
+ Title: Use the Microsoft Audio Stack (MAS) - Speech service
description: An overview of the features, capabilities, and restrictions for audio processing using the Speech Software Development Kit (SDK).
Previously updated : 12/27/2021 Last updated : 01/31/2022 ms.devlang: cpp, csharp, java
-# Using the Microsoft Audio Stack (MAS)
+# Use the Microsoft Audio Stack (MAS)
The Speech SDK integrates Microsoft Audio Stack (MAS), allowing any application or product to use its audio processing capabilities on input audio. See the [Audio processing](audio-processing-overview.md) documentation for an overview.
-In this article, you learn how to use the Speech SDK to leverage the Microsoft Audio Stack (MAS).
+In this article, you learn how to use the Microsoft Audio Stack (MAS) with the Speech SDK.
## Default options
SpeechRecognizer recognizer = new SpeechRecognizer(speechConfig, audioInput);
## Preset microphone geometry This sample shows how to use MAS with a predefined microphone geometry on a specified audio input device. In this example:
-* **Enhancement options** - The default enhancements will be applied on the input audio stream.
+* **Enhancement options** - The default enhancements are applied on the input audio stream.
* **Preset geometry** - The preset geometry represents a linear 2-microphone array. * **Audio input device** - The audio input device ID is `hw:0,1`. For more information on how to select an audio input device, see [How to: Select an audio input device with the Speech SDK](how-to-select-audio-input-devices.md).
SpeechRecognizer recognizer = new SpeechRecognizer(speechConfig, audioInput);
This sample shows how to use MAS with a custom microphone geometry on a specified audio input device. In this example: * **Enhancement options** - The default enhancements will be applied on the input audio stream.
-* **Custom geometry** - A custom microphone geometry for a 7-microphone array is provided by specifying the microphone coordinates. The units for coordinates are millimeters.
-* **Audio input** - The audio input is from a file, where the audio within the file is expected to be captured from an audio input device corresponding to the custom geometry specified.
+* **Custom geometry** - A custom microphone geometry for a 7-microphone array is provided via the microphone coordinates. The units for coordinates are millimeters.
+* **Audio input** - The audio input is from a file, where the audio within the file is expected from an audio input device corresponding to the custom geometry specified.
### [C#](#tab/csharp)
This sample shows how to use MAS with a custom microphone geometry and beamformi
* **Enhancement options** - The default enhancements will be applied on the input audio stream. * **Custom geometry** - A custom microphone geometry for a 4-microphone array is provided by specifying the microphone coordinates. The units for coordinates are millimeters. * **Beamforming angles** - Beamforming angles are specified to optimize for audio originating in that range. The units for angles are degrees. In the sample code below, the start angle is set to 70 degrees and the end angle is set to 110 degrees.
-* **Audio input** - The audio input is from a push stream, where the audio within the stream is expected to be captured from an audio input device corresponding to the custom geometry specified.
+* **Audio input** - The audio input is from a push stream, where the audio within the stream is expected from an audio input device corresponding to the custom geometry specified.
### [C#](#tab/csharp)
SpeechRecognizer recognizer = new SpeechRecognizer(speechConfig, audioInput);
Microsoft Audio Stack requires the reference channel (also known as loopback channel) to perform echo cancellation. The source of the reference channel varies by platform: * **Windows** - The reference channel is automatically gathered by the Speech SDK if the `SpeakerReferenceChannel::LastChannel` option is provided when creating `AudioProcessingOptions`.
-* **Linux** - ALSA (Advanced Linux Sound Architecture) will need to be configured to provide the reference audio stream as the last channel for the audio input device that will be used. This is in addition to providing the `SpeakerReferenceChannel::LastChannel` option when creating `AudioProcessingOptions`.
+* **Linux** - ALSA (Advanced Linux Sound Architecture) must be configured to provide the reference audio stream as the last channel for the audio input device used. ALSA is configured in addition to providing the `SpeakerReferenceChannel::LastChannel` option when creating `AudioProcessingOptions`.
## Language and platform support
-| Language | Platform(s) | Reference docs |
+| Language | Platform | Reference docs |
||-|-| | C++ | Windows, Linux | [C++ docs](/cpp/cognitive-services/speech/) | | C# | Windows, Linux | [C# docs](/dotnet/api/microsoft.cognitiveservices.speech) |
cognitive-services Get Started Speech Translation https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/get-started-speech-translation.md
Title: Speech translation quickstart - Speech service
-description: Learn how to use the Speech SDK to translate speech. In this quickstart, you learn about object construction, supported audio input formats, and configuration options for speech translation.
+description: Learn how to use the Speech SDK to translate speech, including object construction, supported audio input formats, and configuration options.
keywords: speech translation
## Next steps
-* [Use codec compressed audio formats](how-to-use-codec-compressed-audio-input-streams.md)
+* Use [codec-compressed audio formats](how-to-use-codec-compressed-audio-input-streams.md).
* See the [quickstart samples](https://github.com/Azure-Samples/cognitive-services-speech-sdk/tree/master/quickstart) on GitHub.
cognitive-services How To Use Custom Entity Pattern Matching https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/how-to-use-custom-entity-pattern-matching.md
Title: How to use custom entity pattern matching with the C++ Speech SDK
+ Title: How to use custom entity pattern matching with the Speech SDK
description: In this guide, you learn how to recognize intents and custom entities from simple patterns.
Last updated 11/15/2021 -
+ms.devlang: cpp, csharp
+zone_pivot_groups: programming-languages-set-nine
+
-# How to use custom entity pattern matching with the C++ Speech SDK
+# How to use custom entity pattern matching with the Speech SDK
The Cognitive Services [Speech SDK](speech-sdk.md) has a built-in feature to provide **intent recognition** with **simple language pattern matching**. An intent is something the user wants to do: close a window, mark a checkbox, insert some text, etc.
-In this guide, you use the Speech SDK to develop a C++ console application that derives intents from speech utterances spoken through your device's microphone. You'll learn how to:
+In this guide, you use the Speech SDK to develop a console application that derives intents from speech utterances spoken through your device's microphone. You'll learn how to:
> [!div class="checklist"] >
In this guide, you use the Speech SDK to develop a C++ console application that
## When should you use this?
-Use this sample code if:
-* You are only interested in matching very strictly what the user said. These patterns match more aggressively than LUIS.
-* You do not have access to a [LUIS](../LUIS/index.yml) app, but still want intents. This can be helpful since it is embedded within the SDK.
-* You cannot or do not want to create a LUIS app but you still want some voice-commanding capability.
+Use this sample code if:
-If you do not have access to a [LUIS](../LUIS/index.yml) app, but still want intents, this can be helpful since it is embedded within the SDK.
+- You are only interested in matching very strictly what the user said. These patterns match more aggressively than LUIS.
+- You do not have access to a [LUIS](../LUIS/index.yml) app, but still want intents. This can be helpful since it is embedded within the SDK.
+- You cannot or do not want to create a LUIS app but you still want some voice-commanding capability.
+If you do not have access to a [LUIS](../LUIS/index.yml) app, but still want intents, this can be helpful since it is embedded within the SDK.
## Prerequisites
Be sure you have the following items before you begin this guide:
[!INCLUDE [Pattern Matching Overview](includes/pattern-matching-overview.md)]
-## Create a speech project in Visual Studio
--
-## Open your project in Visual Studio
-
-Next, open your project in Visual Studio.
-
-1. Launch Visual Studio 2019.
-2. Load your project and open `helloworld.cpp`.
-
-## Start with some boilerplate code
-
-Let's add some code that works as a skeleton for our project.
-
-```cpp
- #include <iostream>
- #include <speechapi_cxx.h>
-
- using namespace Microsoft::CognitiveServices::Speech;
- using namespace Microsoft::CognitiveServices::Speech::Intent;
-
- int main()
- {
- std::cout << "Hello World!\n";
-
- auto config = SpeechConfig::FromSubscription("YOUR_SUBSCRIPTION_KEY", "YOUR_SUBSCRIPTION_REGION");
- }
-```
-
-## Create a Speech configuration
-
-Before you can initialize an `IntentRecognizer` object, you need to create a configuration that uses the key and Azure region for your Cognitive Services prediction resource.
-
-* Replace `"YOUR_SUBSCRIPTION_KEY"` with your Cognitive Services prediction key.
-* Replace `"YOUR_SUBSCRIPTION_REGION"` with your Cognitive Services resource region.
-
-This sample uses the `FromSubscription()` method to build the `SpeechConfig`. For a full list of available methods, see [SpeechConfig Class](/cpp/cognitive-services/speech/speechconfig).
-
-## Initialize an IntentRecognizer
-
-Now create an `IntentRecognizer`. Insert this code right below your Speech configuration.
-
-```cpp
- auto intentRecognizer = IntentRecognizer::FromConfig(config);
-```
-
-## Add some intents
-
-You need to associate some patterns with a `PatternMatchingModel` and apply it to the `IntentRecognizer`.
-We will start by creating a `PatternMatchingModel` and adding a few intents to it. A PatternMatchingIntent is a struct so we will just use the in-line syntax.
-
-> [!Note]
-> We can add multiple patterns to an `Intent`.
-
-```cpp
- auto model = PatternMatchingModel::FromId("myNewModel");
-
- model->Intents.push_back({"Take me to floor {floorName}.", "Go to floor {floorName}."} , "ChangeFloors");
- model->Intents.push_back({"{action} the door."}, "OpenCloseDoor");
-```
-
-## Add some custom entities
-
-To take full advantage of the pattern matcher you can customize your entities. We will make "floorName" a list of the available floors.
-
-```cpp
- model->Entities.push_back({ "floorName" , Intent::EntityType::List, Intent::EntityMatchMode::Strict, {"one","two", "lobby", "ground floor"} });
-```
-
-## Apply our model to the Recognizer
-
-Now apply the model to the `IntentRecognizer`. Because it's possible to use multiple models at once, the API takes a collection of models.
-
-```cpp
-    std::vector<std::shared_ptr<LanguageUnderstandingModel>> collection;
-
- collection.push_back(model);
- intentRecognizer->ApplyLanguageModels(collection);
-```
-
-## Recognize an intent
-
-From the `IntentRecognizer` object, call the `RecognizeOnceAsync()` method. This method asks the Speech service to recognize speech in a single phrase, and to stop recognizing once the phrase is identified. For simplicity, we'll wait on the returned future to complete.
-
-Insert this code below your intents:
-
-```cpp
- std::cout << "Say something ..." << std::endl;
- auto result = intentRecognizer->RecognizeOnceAsync().get();
-```
-
-## Display the recognition results (or errors)
-
-When the Speech service returns the recognition result, print it.
-
-Insert this code below `auto result = intentRecognizer->RecognizeOnceAsync().get();`:
-
-```cpp
-switch (result->Reason)
-{
-case ResultReason::RecognizedSpeech:
- std::cout << "RECOGNIZED: Text = " << result->Text.c_str() << std::endl;
- std::cout << "NO INTENT RECOGNIZED!" << std::endl;
- break;
-case ResultReason::RecognizedIntent:
-{
-    std::cout << "RECOGNIZED: Text = " << result->Text.c_str() << std::endl;
-    std::cout << "  Intent Id = " << result->IntentId.c_str() << std::endl;
-    auto entities = result->GetEntities();
-    if (entities.find("floorName") != entities.end())
-    {
-        std::cout << "  Floor name: = " << entities["floorName"].c_str() << std::endl;
-    }
-
-    if (entities.find("action") != entities.end())
-    {
-        std::cout << "  Action: = " << entities["action"].c_str() << std::endl;
-    }
-
-    break;
-}
-case ResultReason::NoMatch:
-{
- auto noMatch = NoMatchDetails::FromResult(result);
- switch (noMatch->Reason)
- {
- case NoMatchReason::NotRecognized:
- std::cout << "NOMATCH: Speech was detected, but not recognized." << std::endl;
- break;
- case NoMatchReason::InitialSilenceTimeout:
- std::cout << "NOMATCH: The start of the audio stream contains only silence, and the service timed out waiting for speech." << std::endl;
- break;
- case NoMatchReason::InitialBabbleTimeout:
- std::cout << "NOMATCH: The start of the audio stream contains only noise, and the service timed out waiting for speech." << std::endl;
- break;
- case NoMatchReason::KeywordNotRecognized:
-        std::cout << "NOMATCH: Keyword not recognized." << std::endl;
- break;
- }
- break;
-}
-case ResultReason::Canceled:
-{
- auto cancellation = CancellationDetails::FromResult(result);
-
- if (!cancellation->ErrorDetails.empty())
- {
- std::cout << "CANCELED: ErrorDetails=" << cancellation->ErrorDetails.c_str() << std::endl;
- std::cout << "CANCELED: Did you update the subscription info?" << std::endl;
- }
-    break;
-}
-default:
- break;
-}
-```
-
-## Check your code
-
-At this point, your code should look like this:
-
-```cpp
-#include <iostream>
-#include <speechapi_cxx.h>
-
-using namespace Microsoft::CognitiveServices::Speech;
-using namespace Microsoft::CognitiveServices::Speech::Intent;
-
-int main()
-{
- auto config = SpeechConfig::FromSubscription("YOUR_SUBSCRIPTION_KEY", "YOUR_SUBSCRIPTION_REGION");
- auto intentRecognizer = IntentRecognizer::FromConfig(config);
-
- auto model = PatternMatchingModel::FromId("myNewModel");
-
- model->Intents.push_back({"Take me to floor {floorName}.", "Go to floor {floorName}."} , "ChangeFloors");
- model->Intents.push_back({"{action} the door."}, "OpenCloseDoor");
-
- model->Entities.push_back({ "floorName" , Intent::EntityType::List, Intent::EntityMatchMode::Strict, {"one","two", "lobby", "ground floor"} });
-
-    std::vector<std::shared_ptr<LanguageUnderstandingModel>> collection;
-
- collection.push_back(model);
- intentRecognizer->ApplyLanguageModels(collection);
-
- std::cout << "Say something ..." << std::endl;
-
- auto result = intentRecognizer->RecognizeOnceAsync().get();
-
- switch (result->Reason)
- {
- case ResultReason::RecognizedSpeech:
- std::cout << "RECOGNIZED: Text = " << result->Text.c_str() << std::endl;
- std::cout << "NO INTENT RECOGNIZED!" << std::endl;
- break;
-    case ResultReason::RecognizedIntent:
-    {
-        std::cout << "RECOGNIZED: Text = " << result->Text.c_str() << std::endl;
-        std::cout << "  Intent Id = " << result->IntentId.c_str() << std::endl;
-        auto entities = result->GetEntities();
-        if (entities.find("floorName") != entities.end())
-        {
-            std::cout << "  Floor name: = " << entities["floorName"].c_str() << std::endl;
-        }
-
-        if (entities.find("action") != entities.end())
-        {
-            std::cout << "  Action: = " << entities["action"].c_str() << std::endl;
-        }
-
-        break;
-    }
- case ResultReason::NoMatch:
- {
- auto noMatch = NoMatchDetails::FromResult(result);
- switch (noMatch->Reason)
- {
- case NoMatchReason::NotRecognized:
- std::cout << "NOMATCH: Speech was detected, but not recognized." << std::endl;
- break;
- case NoMatchReason::InitialSilenceTimeout:
- std::cout << "NOMATCH: The start of the audio stream contains only silence, and the service timed out waiting for speech." << std::endl;
- break;
- case NoMatchReason::InitialBabbleTimeout:
- std::cout << "NOMATCH: The start of the audio stream contains only noise, and the service timed out waiting for speech." << std::endl;
- break;
- case NoMatchReason::KeywordNotRecognized:
- std::cout << "NOMATCH: Keyword not recognized." << std::endl;
- break;
- }
- break;
- }
- case ResultReason::Canceled:
- {
- auto cancellation = CancellationDetails::FromResult(result);
-
- if (!cancellation->ErrorDetails.empty())
- {
- std::cout << "CANCELED: ErrorDetails=" << cancellation->ErrorDetails.c_str() << std::endl;
- std::cout << "CANCELED: Did you update the subscription info?" << std::endl;
- }
-        break;
-    }
- default:
- break;
- }
-}
-```
-## Build and run your app
-
-Now you're ready to build your app and test speech recognition using the Speech service.
-
-1. **Compile the code** - From the menu bar of Visual Studio, choose **Build** > **Build Solution**.
-2. **Start your app** - From the menu bar, choose **Debug** > **Start Debugging** or press <kbd>F5</kbd>.
-3. **Start recognition** - The app prompts you to say something. The default language is English. Your speech is sent to the Speech service, transcribed as text, and rendered in the console.
-
-For example, if you say "Take me to floor 2", this should be the output:
-
-```
-Say something ...
-RECOGNIZED: Text = Take me to floor 2.
- Intent Id = ChangeFloors
- Floor name: = 2
-```
-
-As another example, if you say "Take me to floor 7", this should be the output:
-
-```
-Say something ...
-RECOGNIZED: Text = Take me to floor 7.
-NO INTENT RECOGNIZED!
-```
The intent ID is empty because "7" was not in our list of floor names.
cognitive-services How To Use Simple Language Pattern Matching https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/how-to-use-simple-language-pattern-matching.md
Last updated 11/15/2021 -
+zone_pivot_groups: programming-languages-set-nine
+ # How to use simple language pattern matching with the C++ Speech SDK
A pattern is a phrase that includes an Entity somewhere within it. An Entity is
Take me to the {floorName} ```
-This defines an Entity with the ID "floorName" which is case sensitive.
+This defines an Entity with the ID "floorName" which is case-sensitive.
All other special characters and punctuation will be ignored. Intents will be added using calls to the IntentRecognizer->AddIntent() API.
-## Create a speech project in Visual Studio
-
-## Open your project in Visual Studio
-
-Next, open your project in Visual Studio.
-
-1. Launch Visual Studio 2019.
-2. Load your project and open `helloworld.cpp`.
-
-## Start with some boilerplate code
-
-Let's add some code that works as a skeleton for our project.
-
-```cpp
- #include <iostream>
- #include <speechapi_cxx.h>
-
- using namespace Microsoft::CognitiveServices::Speech;
- using namespace Microsoft::CognitiveServices::Speech::Intent;
-
- int main()
- {
- std::cout << "Hello World!\n";
-
- auto config = SpeechConfig::FromSubscription("YOUR_SUBSCRIPTION_KEY", "YOUR_SUBSCRIPTION_REGION");
- }
-```
-
-## Create a Speech configuration
-
-Before you can initialize an `IntentRecognizer` object, you need to create a configuration that uses the key and region for your Cognitive Services prediction resource.
-
-* Replace `"YOUR_SUBSCRIPTION_KEY"` with your Cognitive Services prediction key.
-* Replace `"YOUR_SUBSCRIPTION_REGION"` with your Cognitive Services resource region.
-
-This sample uses the `FromSubscription()` method to build the `SpeechConfig`. For a full list of available methods, see [SpeechConfig Class](/cpp/cognitive-services/speech/speechconfig).
-
-## Initialize an IntentRecognizer
-
-Now create an `IntentRecognizer`. Insert this code right below your Speech configuration.
-
-```cpp
- auto intentRecognizer = IntentRecognizer::FromConfig(config);
-```
-
-## Add some intents
-
-You need to associate some patterns with the `IntentRecognizer` by calling `AddIntent()`.
-We'll add two intents with the same ID for changing floors, and another intent with a separate ID for opening and closing doors.
-
-```cpp
- intentRecognizer->AddIntent("Take me to floor {floorName}.", "ChangeFloors");
- intentRecognizer->AddIntent("Go to floor {floorName}.", "ChangeFloors");
- intentRecognizer->AddIntent("{action} the door.", "OpenCloseDoor");
-```
-
-> [!NOTE]
-> There is no limit to the number of entities you can declare, but they will be loosely matched. If you add a phrase like "{action} door", it will match any time there is text before the word "door". Intents are evaluated based on their number of entities. If two patterns would match, the one with more defined entities is returned.
-
-## Recognize an intent
-
-From the `IntentRecognizer` object, call the `RecognizeOnceAsync()` method. This method asks the Speech service to recognize speech in a single phrase, and to stop recognizing once the phrase is identified. For simplicity, we'll wait on the returned future to complete.
-
-Insert this code below your intents:
-
-```cpp
- std::cout << "Say something ..." << std::endl;
- auto result = intentRecognizer->RecognizeOnceAsync().get();
-```
-
-## Display the recognition results (or errors)
-
-When the Speech service returns the recognition result, print it.
-
-Insert this code below `auto result = intentRecognizer->RecognizeOnceAsync().get();`:
-
-```cpp
-switch (result->Reason)
-{
-case ResultReason::RecognizedSpeech:
- std::cout << "RECOGNIZED: Text = " << result->Text.c_str() << std::endl;
- std::cout << "NO INTENT RECOGNIZED!" << std::endl;
- break;
-case ResultReason::RecognizedIntent:
-{
-    std::cout << "RECOGNIZED: Text = " << result->Text.c_str() << std::endl;
-    std::cout << "  Intent Id = " << result->IntentId.c_str() << std::endl;
-    auto entities = result->GetEntities();
-    if (entities.find("floorName") != entities.end())
-    {
-        std::cout << "  Floor name: = " << entities["floorName"].c_str() << std::endl;
-    }
-
-    if (entities.find("action") != entities.end())
-    {
-        std::cout << "  Action: = " << entities["action"].c_str() << std::endl;
-    }
-
-    break;
-}
-case ResultReason::NoMatch:
-{
- auto noMatch = NoMatchDetails::FromResult(result);
- switch (noMatch->Reason)
- {
- case NoMatchReason::NotRecognized:
- std::cout << "NOMATCH: Speech was detected, but not recognized." << std::endl;
- break;
- case NoMatchReason::InitialSilenceTimeout:
- std::cout << "NOMATCH: The start of the audio stream contains only silence, and the service timed out waiting for speech." << std::endl;
- break;
- case NoMatchReason::InitialBabbleTimeout:
- std::cout << "NOMATCH: The start of the audio stream contains only noise, and the service timed out waiting for speech." << std::endl;
- break;
- case NoMatchReason::KeywordNotRecognized:
-        std::cout << "NOMATCH: Keyword not recognized." << std::endl;
- break;
- }
- break;
-}
-case ResultReason::Canceled:
-{
- auto cancellation = CancellationDetails::FromResult(result);
-
- if (!cancellation->ErrorDetails.empty())
- {
- std::cout << "CANCELED: ErrorDetails=" << cancellation->ErrorDetails.c_str() << std::endl;
- std::cout << "CANCELED: Did you update the subscription info?" << std::endl;
- }
-    break;
-}
-default:
- break;
-}
-```
-
-## Check your code
-
-At this point, your code should look like this:
-
-```cpp
-#include <iostream>
-#include <speechapi_cxx.h>
-
-using namespace Microsoft::CognitiveServices::Speech;
-using namespace Microsoft::CognitiveServices::Speech::Intent;
-
-int main()
-{
- auto config = SpeechConfig::FromSubscription("YOUR_SUBSCRIPTION_KEY", "YOUR_SUBSCRIPTION_REGION");
- auto intentRecognizer = IntentRecognizer::FromConfig(config);
-
- intentRecognizer->AddIntent("Take me to floor {floorName}.", "ChangeFloors");
- intentRecognizer->AddIntent("Go to floor {floorName}.", "ChangeFloors");
- intentRecognizer->AddIntent("{action} the door.", "OpenCloseDoor");
-
- std::cout << "Say something ..." << std::endl;
-
- auto result = intentRecognizer->RecognizeOnceAsync().get();
-
- switch (result->Reason)
- {
- case ResultReason::RecognizedSpeech:
- std::cout << "RECOGNIZED: Text = " << result->Text.c_str() << std::endl;
- std::cout << "NO INTENT RECOGNIZED!" << std::endl;
- break;
-    case ResultReason::RecognizedIntent:
-    {
-        std::cout << "RECOGNIZED: Text = " << result->Text.c_str() << std::endl;
-        std::cout << "  Intent Id = " << result->IntentId.c_str() << std::endl;
-        auto entities = result->GetEntities();
-        if (entities.find("floorName") != entities.end())
-        {
-            std::cout << "  Floor name: = " << entities["floorName"].c_str() << std::endl;
-        }
-
-        if (entities.find("action") != entities.end())
-        {
-            std::cout << "  Action: = " << entities["action"].c_str() << std::endl;
-        }
-
-        break;
-    }
- case ResultReason::NoMatch:
- {
- auto noMatch = NoMatchDetails::FromResult(result);
- switch (noMatch->Reason)
- {
- case NoMatchReason::NotRecognized:
- std::cout << "NOMATCH: Speech was detected, but not recognized." << std::endl;
- break;
- case NoMatchReason::InitialSilenceTimeout:
- std::cout << "NOMATCH: The start of the audio stream contains only silence, and the service timed out waiting for speech." << std::endl;
- break;
- case NoMatchReason::InitialBabbleTimeout:
- std::cout << "NOMATCH: The start of the audio stream contains only noise, and the service timed out waiting for speech." << std::endl;
- break;
- case NoMatchReason::KeywordNotRecognized:
- std::cout << "NOMATCH: Keyword not recognized." << std::endl;
- break;
- }
- break;
- }
- case ResultReason::Canceled:
- {
- auto cancellation = CancellationDetails::FromResult(result);
-
- if (!cancellation->ErrorDetails.empty())
- {
- std::cout << "CANCELED: ErrorDetails=" << cancellation->ErrorDetails.c_str() << std::endl;
- std::cout << "CANCELED: Did you update the subscription info?" << std::endl;
- }
-        break;
-    }
- default:
- break;
- }
-}
-```
-## Build and run your app
-
-Now you're ready to build your app and test speech recognition using the Speech service.
-
-1. **Compile the code** - From the menu bar of Visual Studio, choose **Build** > **Build Solution**.
-2. **Start your app** - From the menu bar, choose **Debug** > **Start Debugging** or press <kbd>F5</kbd>.
-3. **Start recognition** - The app prompts you to say something. The default language is English. Your speech is sent to the Speech service, transcribed as text, and rendered in the console.
-
-For example, if you say "Take me to floor 7", this should be the output:
-
-```
-Say something ...
-RECOGNIZED: Text = Take me to floor 7.
- Intent Id = ChangeFloors
- Floor name: = 7
-```
-
-## Next steps
-
-> Improve your pattern matching by using [custom entities](how-to-use-custom-entity-pattern-matching.md).
cognitive-services Improve Accuracy Phrase List https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/improve-accuracy-phrase-list.md
+
+ Title: Improve recognition accuracy with phrase list
+description: Phrase lists can be used to customize speech recognition results based on context.
+++++ Last updated : 01/26/2022
+zone_pivot_groups: programming-languages-set-two-with-js-spx
++
+# Improve recognition accuracy with phrase list
+
+A phrase list is a list of words or phrases provided ahead of time to help improve recognition of those words and phrases. Adding a phrase to a phrase list increases its importance, making it more likely to be recognized.
+
+Examples of phrases include:
+* Names
+* Geographical locations
+* Homonyms
+* Words or acronyms unique to your industry or organization
+
+Phrase lists are simple and lightweight:
+- **Just-in-time**: A phrase list is provided just before starting the speech recognition, eliminating the need to train a custom model.
+- **Lightweight**: You don't need a large data set. Simply provide a word or phrase to give it importance.
+
+You can use the Speech SDK or Speech Command Line Interface (CLI). The Batch transcription API does not support phrase lists.
+
+There are some situations where [training a custom model](custom-speech-overview.md) that includes phrases is likely the best option to improve accuracy. In these cases, you would not use a phrase list:
+- If you need to use a large list of phrases. A phrase list shouldn't have more than 500 phrases.
+- If you need a phrase list for languages that are not currently supported. For supported phrase list locales, see [Language and voice support for the Speech service](language-support.md#phrase-list).
+- If you use a custom endpoint. Phrase lists can't be used with custom endpoints.
+
+## Try it in Speech Studio
+
+You can use Speech Studio to test how a phrase list helps improve recognition for your audio. To implement a phrase list in your production application, you'll use the Speech SDK or Speech CLI.
+
+For example, let's say that you want the Speech service to recognize this sentence:
+"Hi Rehaan, this is Jessie from Contoso bank."
+
+After testing, you might find that it's incorrectly recognized as:
+"Hi **everyone**, this is **Jesse** from **can't do so bank**."
+
+In this case you would want to add "Rehaan", "Jessie", and "Contoso" to your phrase list. Then the names should be recognized correctly.
+
+Now try Speech Studio to see how a phrase list can improve recognition accuracy.
+
+> [!NOTE]
+> You may be prompted to select your Azure subscription and Speech resource, and then acknowledge billing for your region. If you are new to Azure or Speech, see [Try the Speech service for free](overview.md#try-the-speech-service-for-free).
+
+1. Sign in to [Speech Studio](https://speech.microsoft.com/).
+1. Select **Real-time Speech-to-text**.
+1. You test speech recognition by uploading an audio file or recording audio with a microphone. For example, select **record audio with a microphone** and then say "Hi Rehaan, this is Jessie from Contoso bank." Then select the red button to stop recording.
+1. You should see the transcription result in the **Test results** text box. If "Rehaan", "Jessie", or "Contoso" were not recognized, you can add the terms to a phrase list in the next step.
+1. Select **Show advanced options** and turn on **Phrase list**.
+1. Enter "Contoso;Jessie;Rehaan" in the phrase list text box. Note that multiple phrases need to be separated by a semicolon.
+ :::image type="content" source="./media/custom-speech/phrase-list-after-zoom.png" alt-text="Screenshot of a phrase list applied in Speech Studio." lightbox="./media/custom-speech/phrase-list-after-full.png":::
+1. Use the microphone to test recognition again. Otherwise, you can select the retry arrow next to your audio file to re-run your audio. The terms "Rehaan", "Jessie", and "Contoso" should now be recognized.
+
+## Implement phrase list
+
+With the [Speech SDK](speech-sdk.md), you add each phrase and then run speech recognition. You can optionally clear or update the phrase list, and the change takes effect before the next recognition.
+
+```csharp
+var phraseList = PhraseListGrammar.FromRecognizer(recognizer);
+phraseList.AddPhrase("Contoso");
+phraseList.AddPhrase("Jessie");
+phraseList.AddPhrase("Rehaan");
+phraseList.Clear();
+```
+
+With the [Speech SDK](speech-sdk.md), you add each phrase and then run speech recognition. You can optionally clear or update the phrase list, and the change takes effect before the next recognition.
+
+```cpp
+auto phraseListGrammar = PhraseListGrammar::FromRecognizer(recognizer);
+phraseListGrammar->AddPhrase("Contoso");
+phraseListGrammar->AddPhrase("Jessie");
+phraseListGrammar->AddPhrase("Rehaan");
+phraseListGrammar->Clear();
+```
+
+With the [Speech SDK](speech-sdk.md), you add each phrase and then run speech recognition. You can optionally clear or update the phrase list, and the change takes effect before the next recognition.
+
+```java
+PhraseListGrammar phraseList = PhraseListGrammar.fromRecognizer(recognizer);
+phraseList.addPhrase("Contoso");
+phraseList.addPhrase("Jessie");
+phraseList.addPhrase("Rehaan");
+phraseList.clear();
+```
+
+With the [Speech SDK](speech-sdk.md), you add each phrase and then run speech recognition. You can optionally clear or update the phrase list, and the change takes effect before the next recognition.
+
+```javascript
+const phraseList = sdk.PhraseListGrammar.fromRecognizer(recognizer);
+phraseList.addPhrase("Contoso");
+phraseList.addPhrase("Jessie");
+phraseList.addPhrase("Rehaan");
+phraseList.clear();
+```
+
+With the [Speech SDK](speech-sdk.md), you add each phrase and then run speech recognition. You can optionally clear or update the phrase list, and the change takes effect before the next recognition.
+
+```Python
+phrase_list_grammar = speechsdk.PhraseListGrammar.from_recognizer(reco)
+phrase_list_grammar.addPhrase("Contoso")
+phrase_list_grammar.addPhrase("Jessie")
+phrase_list_grammar.addPhrase("Rehaan")
+phrase_list_grammar.clear()
+```
+
+With the [Speech CLI](spx-overview.md), you can include a phrase list along with the recognize command.
+
+# [Terminal](#tab/terminal)
+
+Try recognition from a microphone or an audio file.
+
+```console
+spx recognize --microphone --phrases "Contoso;Jessie;Rehaan;"
+spx recognize --file "your\path\to\audio.wav" --phrases "Contoso;Jessie;Rehaan;"
+```
+
+You can also add a phrase list using a text file that contains one phrase per line:
+
+```console
+spx recognize --microphone --phrases @phrases.txt
+spx recognize --file "your\path\to\audio.wav" --phrases @phrases.txt
+```
+
+# [PowerShell](#tab/powershell)
+
+Try recognition from a microphone or an audio file.
+
+```powershell
+spx --% recognize --microphone --phrases "Contoso;Jessie;Rehaan;"
+spx --% recognize --file "your\path\to\audio.wav" --phrases "Contoso;Jessie;Rehaan;"
+```
+
+You can also add a phrase list using a text file that contains one phrase per line:
+
+```powershell
+spx --% recognize --microphone --phrases @phrases.txt
+spx --% recognize --file "your\path\to\audio.wav" --phrases @phrases.txt
+```
+
+***
++
+## Next steps
+
+Check out more options to improve recognition accuracy.
+
+> [!div class="nextstepaction"]
+> [Custom Speech](custom-speech-overview.md)
+
cognitive-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/language-support.md
To improve accuracy, customization is available for some languages and baseline
| Turkish (Turkey) | `tr-TR` | Plain text | | Vietnamese (Vietnam) | `vi-VN` | Plain text |
+### Phrase list
+
+You can use the locales in this table with [phrase list](improve-accuracy-phrase-list.md).
+
+| Language | Locale |
+|||
+| Chinese (Mandarin, Simplified) | `zh-CN` |
+| English (Australia) | `en-AU` |
+| English (Canada) | `en-CA` |
+| English (India) | `en-IN` |
+| English (United Kingdom) | `en-GB` |
+| English (United States) | `en-US` |
+| French (France) | `fr-FR` |
+| German (Germany) | `de-DE` |
+| Italian (Italy) | `it-IT` |
+| Japanese (Japan) | `ja-JP` |
+| Portuguese (Brazil) | `pt-BR` |
+| Spanish (Spain) | `es-ES` |
+ ## Text-to-speech Both the Microsoft Speech SDK and REST APIs support these neural voices, each of which supports a specific language and dialect, identified by locale. You can also get a full list of languages and voices supported for each specific region or endpoint through the [voices list API](rest-text-to-speech.md#get-a-list-of-voices).
cognitive-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/overview.md
To add a Speech service resource to your Azure account by using the free or paid
1. Give a unique name for your new resource. The name helps you distinguish among multiple subscriptions tied to the same service. 1. Choose the Azure subscription that the new resource is associated with to determine how the fees are billed. Here's the introduction for [how to create an Azure subscription](../../cost-management-billing/manage/create-subscription.md#create-a-subscription-in-the-azure-portal) in the Azure portal.
- 1. Choose the [region](regions.md) where the resource will be used. Azure is a global cloud platform that's generally available in many regions worldwide. To get the best performance, select a region thatΓÇÖs closest to you or where your application runs. The Speech service availabilities vary among different regions. Make sure that you create your resource in a supported region. For more information, see [region support for Speech services](./regions.md#speech-to-text-text-to-speech-and-translation).
+ 1. Choose the [region](regions.md) where the resource will be used. Azure is a global cloud platform that's generally available in many regions worldwide. To get the best performance, select a region that's closest to you or where your application runs. The Speech service availabilities vary among different regions. Make sure that you create your resource in a supported region. For more information, see [region support for Speech services](./regions.md#speech-to-text-text-to-speech-and-translation).
1. Choose either a free (F0) or paid (S0) pricing tier. For complete information about pricing and usage quotas for each tier, select **View full pricing details** or see [Speech services pricing](https://azure.microsoft.com/pricing/details/cognitive-services/speech-services/). For limits on resources, see [Azure Cognitive Services limits](../../azure-resource-manager/management/azure-subscription-service-limits.md#azure-cognitive-services-limits). 1. Create a new resource group for this Speech subscription or assign the subscription to an existing resource group. Resource groups help you keep your various Azure subscriptions organized. 1. Select **Create**. This action takes you to the deployment overview and displays deployment progress messages.
After you've had a chance to get started with the Speech service, try our tutori
- [Tutorial: Recognize intents from speech with the Speech SDK and LUIS, C#](how-to-recognize-intents-from-speech-csharp.md) - [Tutorial: Voice enable your bot with the Speech SDK, C#](tutorial-voice-enable-your-bot-speech-sdk.md)-- [Tutorial: Build a Flask app to translate text, analyze sentiment, and synthesize translated text to speech, REST](../translator/tutorial-build-flask-app-translation-synthesis.md?bc=%2fazure%2fcognitive-services%2fspeech-service%2fbreadcrumb%2ftoc.json%252c%2fen-us%2fazure%2fbread%2ftoc.json&toc=%2fazure%2fcognitive-services%2fspeech-service%2ftoc.json%252c%2fen-us%2fazure%2fcognitive-services%2fspeech-service%2ftoc.json)
+- [Tutorial: Build a Flask app to translate text, analyze sentiment, and synthesize translated text to speech, REST](/learn/modules/python-flask-build-ai-web-app/)
## Get sample code
cognitive-services Releasenotes https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/releasenotes.md
See below for information about changes to Speech services and resources.
## What's new?
+* Speech SDK 1.20.0 released January 2022. Updates include extended programming language support for DialogServiceConnector, Unity on Linux, enhancements to IntentRecognizer, added support for Python 3.10, and a fix to remove a 10-second delay while stopping a speech recognizer (when using a PushAudioInputStream, and no new audio is pushed in after StopContinuousRecognition is called).
+* Speech CLI 1.20.0 released January 2022. Updates include microphone input for Speaker recognition and expanded support for Intent recognition.
* Speaker Recognition service is generally available (GA). With [Speaker Recognition](./speaker-recognition-overview.md) you can accurately verify and identify speakers by their unique voice characteristics.
-* Speech SDK 1.19.0 release including Speaker Recognition support, Mac M1 ARM support, OpenSSL linking in Linux is dynamic, and Ubuntu 16.04 is no longer supported.
* Custom Neural Voice extended to support [49 locales](./language-support.md#custom-neural-voice). * Prebuilt Neural Voice added new [languages and variants](./language-support.md#prebuilt-neural-voices). * Commitment Tiers added to [pricing options](https://azure.microsoft.com/pricing/details/cognitive-services/speech-services/).
See below for information about changes to Speech services and resources.
[!INCLUDE [speech-cli](./includes/release-notes/release-notes-cli.md)]
-# [Text-to-speech](#tab/text-to-speech)
+# [Text-to-speech service](#tab/text-to-speech)
[!INCLUDE [text-to-speech](./includes/release-notes/release-notes-tts.md)]
-# [Speech-to-text](#tab/speech-to-text)
+# [Speech-to-text service](#tab/speech-to-text)
[!INCLUDE [speech-to-text](./includes/release-notes/release-notes-stt.md)]
cognitive-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Translator/language-support.md
Previously updated : 10/28/2021 Last updated : 02/01/2022 # Translator language support
**Dictionary:** Use the [Dictionary Lookup](reference/v3-0-dictionary-lookup.md) or [Dictionary Examples](reference/v3-0-dictionary-examples.md) operations from the Text Translation feature to display alternative translations from or to English and examples of words in context.
+## Translation
+ | Language | Language code | Cloud – Text Translation and Document Translation| Containers – Text Translation|Custom Translator|Auto Language Detection|Dictionary |:-|:-:|:-:|:-:|:-:|:-:|:-:| | Afrikaans | `af` |✔|✔|✔|✔|✔|
| Hungarian | `hu` |✔|✔|✔|✔|✔| | Icelandic | `is` |✔|✔|✔|✔|✔| | Indonesian | `id` |✔|✔|✔|✔|✔|
+| 🆕 </br> Inuinnaqtun | `ikt` |✔|||||
| Inuktitut | `iu` |✔|✔|✔|✔||
+| 🆕 </br> Inuktitut (Latin) | `iu-Latn` |✔|||||
| Irish | `ga` |✔|✔|✔|✔|| | Italian | `it` |✔|✔|✔|✔|✔| | Japanese | `ja` |✔|✔|✔|✔|✔|
cognitive-services Tutorial Build Flask App Translation Synthesis https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Translator/tutorial-build-flask-app-translation-synthesis.md
- Title: "Tutorial: Build a Flask app to translate, synthesize, and analyze text - Translator"-
-description: In this tutorial, you'll build a Flask-based web app to translate text, analyze sentiment, and synthesize translated text into speech.
------ Previously updated : 10/28/2021----
-# Tutorial: Build a Flask app with Azure Cognitive Services
-
-In this tutorial, you'll build a Flask web app that uses Azure Cognitive Services to translate text, analyze sentiment, and synthesize translated text into speech. Our focus is on the Python code and Flask routes that enable the application; however, we'll help you with the HTML and JavaScript that pulls the app together. If you run into any issues, let us know by using the feedback button below.
-
-Here's what this tutorial covers:
-
-> [!div class="checklist"]
-> * Get Azure subscription keys
-> * Set up your development environment and install dependencies
-> * Create a Flask app
-> * Use the Translator to translate text
-> * Use the Language service to analyze positive/negative sentiment of input text and translations
-> * Use the Speech Service to convert translated text into synthesized speech
-> * Run your Flask app locally
-
-> [!TIP]
-> If you'd like to skip ahead and see all the code at once, the entire sample, along with build instructions are available on [GitHub](https://github.com/MicrosoftTranslator/Text-Translation-API-V3-Flask-App-Tutorial).
-
-## What is Flask?
-
-Flask is a microframework for creating web applications. This means Flask provides you with the tools, libraries, and technologies to build a web application. That application can be a few web pages, a blog, or a wiki, or it can be as substantial as a web-based calendar application or a commercial website.
-
-If you want to dive deeper after this tutorial, here are a few helpful links:
-
-* [Flask documentation](http://flask.pocoo.org/)
-* [Flask for Dummies - A Beginner's Guide to Flask](https://codeburst.io/flask-for-dummies-a-beginners-guide-to-flask-part-uno-53aec6afc5b1)
-
-## Prerequisites
-
-Let's review the software and subscription keys that you'll need for this tutorial.
-
-* [Python 3.6 or later](https://www.python.org/downloads/)
-* [Git tools](https://git-scm.com/downloads)
-* An IDE or text editor, such as [Visual Studio Code](https://code.visualstudio.com/) or [Atom](https://atom.io/)
-* [Chrome](https://www.google.com/chrome/browser/) or [Firefox](https://www.mozilla.org/firefox)
-* A **Translator** subscription key (you can use the **global** location).
-* A **Language service** subscription key in the **West US** region.
-* A **Speech Services** subscription key in the **West US** region.
-
-## Create an account and subscribe to resources
-
-As previously mentioned, you're going to need three subscription keys for this tutorial. This means that you need to create a resource within your Azure account for:
-* Translator
-* Language service
-* Speech Services
-
-Use [Create a Cognitive Services Account in the Azure portal](../cognitive-services-apis-create-account.md) for step-by-step instructions to create resources.
-
-> [!IMPORTANT]
-> For this tutorial, please create your resources in the West US region. If using a different region, you'll need to adjust the base URL in each of your Python files.
-
-## Set up your dev environment
-
-Before you build your Flask web app, you'll need to create a working directory for your project and install a few Python packages.
-
-### Create a working directory
-
-1. Open a command prompt (Windows) or terminal (macOS/Linux). Then, create a working directory and subdirectories for your project:
-
- ```
- mkdir -p flask-cog-services/static/scripts && mkdir flask-cog-services/templates
- ```
-2. Change to your project's working directory:
-
- ```
- cd flask-cog-services
- ```
-
-### Create and activate your virtual environment with `virtualenv`
-
-Let's create a virtual environment for our Flask app using `virtualenv`. Using a virtual environment ensures that you have a clean environment to work from.
-
-1. In your working directory, run this command to create a virtual environment:
- **macOS/Linux:**
- ```
- virtualenv venv --python=python3
- ```
- We've explicitly declared that the virtual environment should use Python 3. This ensures that users with multiple Python installations are using the correct version.
-
- **Windows CMD / Windows Bash:**
- ```
- virtualenv venv
- ```
- To keep things simple, we're naming your virtual environment venv.
-
-2. The commands to activate your virtual environment will vary depending on your platform/shell:
-
- | Platform | Shell | Command |
- |-|-||
- | macOS/Linux | bash/zsh | `source venv/bin/activate` |
- | Windows | bash | `source venv/Scripts/activate` |
- | | Command Line | `venv\Scripts\activate.bat` |
- | | PowerShell | `venv\Scripts\Activate.ps1` |
-
- After running this command, your command line or terminal session should be prefaced with `venv`.
-
-3. You can deactivate the session at any time by typing this into the command line or terminal: `deactivate`.
-
-> [!NOTE]
-> Python has extensive documentation for creating and managing virtual environments, see [virtualenv](https://virtualenv.pypa.io/en/latest/).
-
-### Install requests
-
-Requests is a popular module that is used to send HTTP 1.1 requests. There's no need to manually add query strings to your URLs, or to form-encode your POST data.
-
-1. To install requests, run:
-
- ```
- pip install requests
- ```
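To make that convenience concrete, here's a minimal sketch (separate from the tutorial's app) that prepares a request with a `params` dictionary; `requests` encodes the query string for you. The URL is the Translator endpoint used later in this tutorial, but nothing is actually sent over the network:

```python
import requests

# Build (but don't send) a request; requests turns the params dict
# into a properly encoded query string appended to the URL.
req = requests.Request(
    'POST',
    'https://api.cognitive.microsofttranslator.com/translate',
    params={'api-version': '3.0', 'to': 'de'},
    json=[{'text': 'Hello'}],
).prepare()

print(req.url)  # the query string is appended and URL-encoded for you
```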
-
-> [!NOTE]
-> If you'd like to learn more about requests, see [Requests: HTTP for Humans](https://2.python-requests.org/en/master/).
-
-### Install and configure Flask
-
-Next we need to install Flask. Flask handles the routing for our web app, and allows us to make server-to-server calls that hide our subscription keys from the end user.
-
-1. To install Flask, run:
- ```
- pip install Flask
- ```
- Let's make sure Flask was installed. Run:
- ```
- flask --version
- ```
- The version should be printed to the terminal; anything else means something went wrong.
-
-2. To run the Flask app, you can either use the flask command or Python's -m switch with Flask. Before you can do that, you need to tell your terminal which app to work with by exporting the `FLASK_APP` environment variable:
-
- **macOS/Linux**:
- ```
- export FLASK_APP=app.py
- ```
-
- **Windows**:
- ```
- set FLASK_APP=app.py
- ```
-
-## Create your Flask app
-
-In this section, you're going to create a barebones Flask app that returns an HTML file when users hit the root of your app. Don't spend too much time trying to pick apart the code; we'll come back to update this file later.
-
-### What is a Flask route?
-
-Let's take a minute to talk about "[routes](http://flask.pocoo.org/docs/1.0/api/#flask.Flask.route)". Routing is used to bind a URL to a specific function. Flask uses route decorators to register functions to specific URLs. For example, when a user navigates to the root (`/`) of our web app, `index.html` is rendered.
-
-```python
-@app.route('/')
-def index():
-    return render_template('index.html')
-```
-
-Let's take a look at one more example to hammer this home.
-
-```python
-@app.route('/about')
-def about():
-    return render_template('about.html')
-```
-
-This code ensures that when a user navigates to `http://your-web-app.com/about`, the `about.html` file is rendered.
-
-While these samples illustrate how to render html pages for a user, routes can also be used to call APIs when a button is pressed, or take any number of actions without having to navigate away from the homepage. You'll see this in action when you create routes for translation, sentiment, and speech synthesis.
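To illustrate that pattern, here's a minimal sketch of a JSON route. The `/echo` route is hypothetical (it isn't part of this tutorial's app), but it uses the same accept-`POST`-and-return-JSON shape that the translation, sentiment, and speech synthesis routes will use:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical route: accepts JSON via POST and returns JSON,
# without rendering a page or navigating away from the current one.
@app.route('/echo', methods=['POST'])
def echo():
    data = request.get_json()
    return jsonify({'received': data})
```

You'll build real versions of this pattern in the translation, sentiment analysis, and speech synthesis sections that follow.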
-
-### Get started
-
-1. Open the project in your IDE, then create a file named `app.py` in the root of your working directory. Next, copy this code into `app.py` and save:
-
- ```python
- from flask import Flask, render_template, url_for, jsonify, request
-
- app = Flask(__name__)
- app.config['JSON_AS_ASCII'] = False
-
- @app.route('/')
- def index():
-        return render_template('index.html')
- ```
-
-    This code block tells the app to display `index.html` whenever a user navigates to the root of your web app (`/`).
-
-2. Next, let's create the front-end for our web app. Create a file named `index.html` in the `templates` directory. Then copy this code into `templates/index.html`.
-
- ```html
- <!doctype html>
- <html lang="en">
- <head>
- <!-- Required metadata tags -->
- <meta charset="utf-8">
- <meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
- <meta name="description" content="Translate and analyze text with Azure Cognitive Services.">
- <!-- Bootstrap CSS -->
- <link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/4.0.0/css/bootstrap.min.css" integrity="sha384-Gn5384xqQ1aoWXA+058RXPxPg6fy4IWvTNh0E263XmFcJlSAwiGgFAW/dAiS6JXm" crossorigin="anonymous">
- <title>Translate and analyze text with Azure Cognitive Services</title>
- </head>
- <body>
- <div class="container">
- <h1>Translate, synthesize, and analyze text with Azure</h1>
- <p>This simple web app uses Azure for text translation, text-to-speech conversion, and sentiment analysis of input text and translations. Learn more about <a href="https://docs.microsoft.com/azure/cognitive-services/">Azure Cognitive Services</a>.
- </p>
- <!-- HTML provided in the following sections goes here. -->
-
- <!-- End -->
- </div>
-
- <!-- Required Javascript for this tutorial -->
- <script src="https://code.jquery.com/jquery-3.2.1.slim.min.js" integrity="sha384-KJ3o2DKtIkvYIK3UENzmM7KCkRr/rE9/Qpg6aAZGJwFDMVNA/GpGFF93hXpG5KkN" crossorigin="anonymous"></script>
- <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
- <script src="https://cdnjs.cloudflare.com/ajax/libs/popper.js/1.12.9/umd/popper.min.js" integrity="sha384-ApNbgh9B+Y1QKtv3Rn7W3mgPxhU9K/ScQsAP7hUibX39j7fakFPskvXusvfa0b4Q" crossorigin="anonymous"></script>
- <script src="https://maxcdn.bootstrapcdn.com/bootstrap/4.0.0/js/bootstrap.min.js" integrity="sha384-JZR6Spejh4U02d8jOt6vLEHfe/JQGiRRSQQxSfFWpi1MquVdAyjUar5+76PVCmYl" crossorigin="anonymous"></script>
- <script type = "text/javascript" src ="static/scripts/main.js"></script>
- </body>
- </html>
- ```
-
-3. Let's test the Flask app. From the terminal, run:
-
- ```
- flask run
- ```
-
-4. Open a browser and navigate to the URL provided. You should see your single page app. Press **Ctrl + C** to kill the app.
-
-## Translate text
-
-Now that you have an idea of how a simple Flask app works, let's:
-
-* Write some Python to call the Translator and return a response
-* Create a Flask route to call your Python code
-* Update the HTML with an area for text input and translation, a language selector, and translate button
-* Write JavaScript that allows users to interact with your Flask app from the HTML
-
-### Call the Translator
-
-The first thing you need to do is write a function to call the Translator. This function will take two arguments: `text_input` and `language_output`. This function is called whenever a user presses the translate button in your app. The text area in the HTML is sent as the `text_input`, and the language selection value in the HTML is sent as `language_output`.
-
-1. Let's start by creating a file called `translate.py` in the root of your working directory.
-2. Next, add this code to `translate.py`. This function takes two arguments: `text_input` and `language_output`.
- ```python
- import os, requests, uuid, json
-
- # Don't forget to replace with your Cog Services subscription key!
- # If you prefer to use environment variables, see Extra Credit for more info.
- subscription_key = 'YOUR_TRANSLATOR_TEXT_SUBSCRIPTION_KEY'
- location = 'YOUR_TRANSLATOR_RESOURCE_LOCATION'
- # Don't forget to replace with your Cog Services location!
- # Our Flask route will supply two arguments: text_input and language_output.
- # When the translate text button is pressed in our Flask app, the Ajax request
- # will grab these values from our web app, and use them in the request.
- # See main.js for Ajax calls.
- def get_translation(text_input, language_output):
- base_url = 'https://api.cognitive.microsofttranslator.com'
- path = '/translate?api-version=3.0'
- params = '&to=' + language_output
- constructed_url = base_url + path + params
-
- headers = {
- 'Ocp-Apim-Subscription-Key': subscription_key,
- 'Ocp-Apim-Subscription-Region': location,
- 'Content-type': 'application/json',
- 'X-ClientTraceId': str(uuid.uuid4())
- }
-
- # You can pass more than one object in body.
- body = [{
- 'text' : text_input
- }]
- response = requests.post(constructed_url, headers=headers, json=body)
- return response.json()
- ```
-3. Add your Translator subscription key and save.
-
-### Add a route to `app.py`
-
-Next, you'll need to create a route in your Flask app that calls `translate.py`. This route will be called each time a user presses the translate button in your app.
-
-For this app, your route is going to accept `POST` requests. This is because the function expects the text to translate and an output language for the translation.
-
-Flask provides helper functions to help you parse and manage each request. In the code provided, `get_json()` returns the data from the `POST` request as JSON. Then using `data['text']` and `data['to']`, the text and output language values are passed to `get_translation()` function available from `translate.py`. The last step is to return the response as JSON, since you'll need to display this data in your web app.
-
-In the following sections, you'll repeat this process as you create routes for sentiment analysis and speech synthesis.
-
-1. Open `app.py` and locate the import statement at the top of `app.py` and add the following line:
-
- ```python
- import translate
- ```
- Now our Flask app can use the method available via `translate.py`.
-
-2. Copy this code to the end of `app.py` and save:
-
- ```python
- @app.route('/translate-text', methods=['POST'])
- def translate_text():
- data = request.get_json()
- text_input = data['text']
- translation_output = data['to']
- response = translate.get_translation(text_input, translation_output)
- return jsonify(response)
- ```
-
-### Update `index.html`
-
-Now that you have a function to translate text, and a route in your Flask app to call it, the next step is to start building the HTML for your app. The HTML below does a few things:
-
-* Provides a text area where users can input text to translate.
-* Includes a language selector.
-* Includes HTML elements to render the detected language and confidence scores returned during translation.
-* Provides a read-only text area where the translation output is displayed.
-* Includes placeholders for sentiment analysis and speech synthesis code that you'll add to this file later in the tutorial.
-
-Let's update `index.html`.
-
-1. Open `index.html` and locate these code comments:
- ```html
- <!-- HTML provided in the following sections goes here. -->
-
- <!-- End -->
- ```
-
-2. Replace the code comments with this HTML block:
- ```html
- <div class="row">
- <div class="col">
- <form>
- <!-- Enter text to translate. -->
- <div class="form-group">
- <label for="text-to-translate"><strong>Enter the text you'd like to translate:</strong></label>
- <textarea class="form-control" id="text-to-translate" rows="5"></textarea>
- </div>
- <!-- Select output language. -->
- <div class="form-group">
- <label for="select-language"><strong>Translate to:</strong></label>
- <select class="form-control" id="select-language">
- <option value="ar">Arabic</option>
- <option value="ca">Catalan</option>
- <option value="zh-Hans">Chinese (Simplified)</option>
- <option value="zh-Hant">Chinese (Traditional)</option>
- <option value="hr">Croatian</option>
- <option value="en">English</option>
- <option value="fr">French</option>
- <option value="de">German</option>
- <option value="el">Greek</option>
- <option value="he">Hebrew</option>
- <option value="hi">Hindi</option>
- <option value="it">Italian</option>
- <option value="ja">Japanese</option>
- <option value="ko">Korean</option>
- <option value="pt">Portuguese</option>
- <option value="ru">Russian</option>
- <option value="es">Spanish</option>
- <option value="th">Thai</option>
- <option value="tr">Turkish</option>
- <option value="vi">Vietnamese</option>
- </select>
- </div>
- <button type="submit" class="btn btn-primary mb-2" id="translate">Translate text</button></br>
- <div id="detected-language" style="display: none">
- <strong>Detected language:</strong> <span id="detected-language-result"></span><br />
- <strong>Detection confidence:</strong> <span id="confidence"></span><br /><br />
- </div>
-
- <!-- Start sentiment code-->
-
- <!-- End sentiment code -->
-
- </form>
- </div>
- <div class="col">
- <!-- Translated text returned by the Translate API is rendered here. -->
- <form>
- <div class="form-group" id="translator-text-response">
- <label for="translation-result"><strong>Translated text:</strong></label>
- <textarea readonly class="form-control" id="translation-result" rows="5"></textarea>
- </div>
-
- <!-- Start voice font selection code -->
-
- <!-- End voice font selection code -->
-
- </form>
-
- <!-- Add Speech Synthesis button and audio element -->
-
- <!-- End Speech Synthesis button -->
-
- </div>
- </div>
- ```
-
-The next step is to write some JavaScript. This is the bridge between your HTML and Flask route.
-
-### Create `main.js`
-
-The `main.js` file is the bridge between your HTML and Flask route. Your app will use a combination of jQuery, Ajax, and XMLHttpRequest to render content, and make `POST` requests to your Flask routes.
-
-In the code below, content from the HTML is used to construct a request to your Flask route. Specifically, the contents of the text area and the language selector are assigned to variables, and then passed along in the request to `translate-text`.
-
-The code then iterates through the response, and updates the HTML with the translation, detected language, and confidence score.
-
-1. From your IDE, create a file named `main.js` in the `static/scripts` directory.
-2. Copy this code into `static/scripts/main.js`:
- ```javascript
- //Initiate jQuery on load.
- $(function() {
- //Translate text with flask route
- $("#translate").on("click", function(e) {
- e.preventDefault();
- var translateVal = document.getElementById("text-to-translate").value;
- var languageVal = document.getElementById("select-language").value;
- var translateRequest = { 'text': translateVal, 'to': languageVal }
-
- if (translateVal !== "") {
- $.ajax({
- url: '/translate-text',
- method: 'POST',
- headers: {
- 'Content-Type':'application/json'
- },
- dataType: 'json',
- data: JSON.stringify(translateRequest),
- success: function(data) {
- for (var i = 0; i < data.length; i++) {
- document.getElementById("translation-result").textContent = data[i].translations[0].text;
- document.getElementById("detected-language-result").textContent = data[i].detectedLanguage.language;
- if (document.getElementById("detected-language-result").textContent !== ""){
- document.getElementById("detected-language").style.display = "block";
- }
- document.getElementById("confidence").textContent = data[i].detectedLanguage.score;
- }
- }
- });
- };
- });
- // In the following sections, you'll add code for sentiment analysis and
- // speech synthesis here.
- })
- ```
-
-### Test translation
-
-Let's test translation in the app.
-
-```
-flask run
-```
-
-Navigate to the provided server address. Type text into the input area, select a language, and press translate. You should get a translation. If it doesn't work, make sure that you've added your subscription key.
-
-> [!TIP]
-> If the changes you've made aren't showing up, or the app doesn't work the way you expect it to, try clearing your cache or opening a private/incognito window.
-
-Press **CTRL + c** to kill the app, then head to the next section.
-
-## Analyze sentiment
-
-The [Language service API](../language-service/overview.md) can be used to perform sentiment analysis, extract key phrases from text, or detect the source language. In this app, we're going to use sentiment analysis to determine if the provided text is positive, neutral, negative, or mixed. The API returns a sentiment label for each document, along with confidence scores between 0 and 1 for each sentiment class; the closer a score is to 1, the more confident the service is in that label.
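To make the response shape concrete before writing any code, here's a small sketch that parses a hand-written sample payload. The field names (`documents`, `sentiment`, `errors`) match the fields this tutorial reads from the real response; the values are illustrative, not actual API output:

```python
# Hand-written sample shaped like a Language service sentiment response.
# Values are illustrative only.
sample = {
    'documents': [
        {'id': '1', 'sentiment': 'positive',
         'confidenceScores': {'positive': 0.98, 'neutral': 0.01, 'negative': 0.01}}
    ],
    'errors': []
}

def label_for(doc_id, response):
    """Return the sentiment label for a document id, or the error message."""
    for doc in response.get('documents', []):
        if doc['id'] == doc_id:
            return doc['sentiment']
    for err in response.get('errors', []):
        if err['id'] == doc_id:
            return err['message']
    return None

print(label_for('1', sample))  # -> positive
```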
-
-In this section, you're going to do a few things:
-
-* Write some Python to call the Language service API to perform sentiment analysis and return a response
-* Create a Flask route to call your Python code
-* Update the HTML with an area for sentiment scores, and a button to perform analysis
-* Write JavaScript that allows users to interact with your Flask app from the HTML
-
-### Call the Language service API
-
-Let's write a function to call the Language service API. This function will take two arguments: `input_text` and `input_language`. It's called whenever a user presses the run sentiment analysis button in your app. The text provided by the user and the detected source language are sent with each request, and the response object includes a sentiment label for the source text. In the following sections, you're going to write some JavaScript to parse the response and use it in your app. For now, let's focus on calling the Language service API.
-
-1. Let's create a file called `sentiment.py` in the root of your working directory.
-2. Next, add this code to `sentiment.py`.
- ```python
- import os, requests, uuid, json
-
- # Don't forget to replace with your Cog Services subscription key!
- subscription_key = 'YOUR_TEXT_ANALYTICS_SUBSCRIPTION_KEY'
- endpoint = "YOUR_TEXT_ANALYTICS_ENDPOINT"
- # Our Flask route will supply four arguments: input_text, input_language,
- # output_text, output_language.
- # When the run sentiment analysis button is pressed in our Flask app,
- # the Ajax request will grab these values from our web app, and use them
- # in the request. See main.js for Ajax calls.
-
- def get_sentiment(input_text, input_language):
- path = '/text/analytics/v3.0/sentiment'
- constructed_url = endpoint + path
-
- headers = {
- 'Ocp-Apim-Subscription-Key': subscription_key,
- 'Content-type': 'application/json',
- 'X-ClientTraceId': str(uuid.uuid4())
- }
-
- # You can pass more than one object in body.
- body = {
- 'documents': [
- {
- 'language': input_language,
- 'id': '1',
- 'text': input_text
- },
- ]
- }
- response = requests.post(constructed_url, headers=headers, json=body)
- return response.json()
- ```
-3. Add your Language service subscription key and save.
-
-### Add a route to `app.py`
-
-Let's create a route in your Flask app that calls `sentiment.py`. This route will be called each time a user presses the run sentiment analysis button in your app. Like the route for translation, this route is going to accept `POST` requests since the function expects arguments.
-
-1. Open `app.py` and locate the import statement at the top of `app.py` and update it:
-
- ```python
- import translate, sentiment
- ```
- Now our Flask app can use the method available via `sentiment.py`.
-
-2. Copy this code to the end of `app.py` and save:
- ```python
- @app.route('/sentiment-analysis', methods=['POST'])
- def sentiment_analysis():
- data = request.get_json()
- input_text = data['inputText']
- input_lang = data['inputLanguage']
- response = sentiment.get_sentiment(input_text, input_lang)
- return jsonify(response)
- ```
-
-### Update `index.html`
-
-Now that you have a function to run sentiment analysis, and a route in your Flask app to call it, the next step is to start writing the HTML for your app. The HTML below does a few things:
-
-* Adds a button to your app to run sentiment analysis
-* Adds an element that explains sentiment scoring
-* Adds an element to display the sentiment scores
-
-1. Open `index.html` and locate these code comments:
- ```html
- <!-- Start sentiment code-->
-
- <!-- End sentiment code -->
- ```
-
-2. Replace the code comments with this HTML block:
- ```html
- <button type="submit" class="btn btn-primary mb-2" id="sentiment-analysis">Run sentiment analysis</button></br>
- <div id="sentiment" style="display: none">
- <p>Sentiment can be labeled as "positive", "negative", "neutral", or "mixed". </p>
- <strong>Sentiment label for input:</strong> <span id="input-sentiment"></span><br />
- </div>
- ```
-
-### Update `main.js`
-
-In the code below, content from the HTML is used to construct a request to your Flask route. Specifically, the contents of the text area and the language selector are assigned to variables, and then passed along in the request to the `sentiment-analysis` route.
-
-The code then iterates through the response, and updates the HTML with the sentiment scores.
-
-1. From your IDE, open the `main.js` file that you created earlier in the `static/scripts` directory.
-
-2. Copy this code into `static/scripts/main.js`, below the code comment that marks where sentiment analysis code goes:
- ```javascript
- //Run sentiment analysis on input and translation.
- $("#sentiment-analysis").on("click", function(e) {
- e.preventDefault();
- var inputText = document.getElementById("text-to-translate").value;
- var inputLanguage = document.getElementById("detected-language-result").innerHTML;
- var outputText = document.getElementById("translation-result").value;
- var outputLanguage = document.getElementById("select-language").value;
-
- var sentimentRequest = { "inputText": inputText, "inputLanguage": inputLanguage};
-
- if (inputText !== "") {
- $.ajax({
- url: "/sentiment-analysis",
- method: "POST",
- headers: {
- "Content-Type":"application/json"
- },
- dataType: "json",
- data: JSON.stringify(sentimentRequest),
- success: function(data) {
- for (var i = 0; i < data.documents.length; i++) {
- if (typeof data.documents[i] !== "undefined"){
- if (data.documents[i].id === "1") {
- document.getElementById("input-sentiment").textContent = data.documents[i].sentiment;
- }
- }
- }
- for (var i = 0; i < data.errors.length; i++) {
- if (typeof data.errors[i] !== "undefined"){
- if (data.errors[i].id === "1") {
- document.getElementById("input-sentiment").textContent = data.errors[i].message;
- }
- }
- }
- if (document.getElementById("input-sentiment").textContent !== ''){
- document.getElementById("sentiment").style.display = "block";
- }
- }
- });
- }
- });
- // In the next section, you'll add code for speech synthesis here.
- ```
-
-### Test sentiment analysis
-
-Let's test sentiment analysis in the app.
-
-```
-flask run
-```
-
-Navigate to the provided server address. Type text into the input area, select a language, and press translate. You should get a translation. Next, press the run sentiment analysis button. You should see a sentiment label for your input text. If it doesn't work, make sure that you've added your subscription key.
-
-> [!TIP]
-> If the changes you've made aren't showing up, or the app doesn't work the way you expect it to, try clearing your cache or opening a private/incognito window.
-
-Press **CTRL + c** to kill the app, then head to the next section.
-
-## Convert text-to-speech
-
-The [Text-to-speech API](../speech-service/text-to-speech.md) enables your app to convert text into natural human-like synthesized speech. The service supports standard, neural, and custom voices. Our sample app uses a handful of the available voices; for a full list, see [supported languages](../speech-service/language-support.md#text-to-speech).
-
-In this section, you're going to do a few things:
-
-* Write some Python to convert text-to-speech with the Text-to-speech API
-* Create a Flask route to call your Python code
-* Update the HTML with a button to convert text-to-speech, and an element for audio playback
-* Write JavaScript that allows users to interact with your Flask app
-
-### Call the Text-to-Speech API
-
-Let's write a function to convert text-to-speech. This function will take two arguments: `input_text` and `voice_font`. This function is called whenever a user presses the convert text-to-speech button in your app. `input_text` is the translation output returned by the call to translate text, `voice_font` is the value from the voice font selector in the HTML.
-
-1. Let's create a file called `synthesize.py` in the root of your working directory.
-
-2. Next, add this code to `synthesize.py`.
- ```Python
- import os, requests, time
- from xml.etree import ElementTree
-
- class TextToSpeech(object):
- def __init__(self, input_text, voice_font):
- subscription_key = 'YOUR_SPEECH_SERVICES_SUBSCRIPTION_KEY'
- self.subscription_key = subscription_key
- self.input_text = input_text
- self.voice_font = voice_font
- self.timestr = time.strftime('%Y%m%d-%H%M')
- self.access_token = None
-
- # This function performs the token exchange.
- def get_token(self):
- fetch_token_url = 'https://westus.api.cognitive.microsoft.com/sts/v1.0/issueToken'
- headers = {
- 'Ocp-Apim-Subscription-Key': self.subscription_key
- }
- response = requests.post(fetch_token_url, headers=headers)
- self.access_token = str(response.text)
-
- # This function calls the TTS endpoint with the access token.
- def save_audio(self):
- base_url = 'https://westus.tts.speech.microsoft.com/'
- path = 'cognitiveservices/v1'
- constructed_url = base_url + path
- headers = {
- 'Authorization': 'Bearer ' + self.access_token,
- 'Content-Type': 'application/ssml+xml',
- 'X-Microsoft-OutputFormat': 'riff-24khz-16bit-mono-pcm',
- 'User-Agent': 'YOUR_RESOURCE_NAME',
- }
- # Build the SSML request with ElementTree
- xml_body = ElementTree.Element('speak', version='1.0')
- xml_body.set('{http://www.w3.org/XML/1998/namespace}lang', 'en-us')
- voice = ElementTree.SubElement(xml_body, 'voice')
- voice.set('{http://www.w3.org/XML/1998/namespace}lang', 'en-US')
- voice.set('name', 'Microsoft Server Speech Text to Speech Voice {}'.format(self.voice_font))
- voice.text = self.input_text
- # The body must be encoded as UTF-8 to handle non-ascii characters.
- body = ElementTree.tostring(xml_body, encoding="utf-8")
-
- #Send the request
- response = requests.post(constructed_url, headers=headers, data=body)
-
- # Write the response as a wav file for playback. The file is located
- # in the same directory where this sample is run.
- return response.content
- ```
-3. Add your Speech Services subscription key and save.
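If you're curious what the request body looks like, this standalone sketch builds the same SSML that `save_audio()` constructs, with no network call or subscription key required; the voice name shown is one of the options used later in this tutorial:

```python
from xml.etree import ElementTree

# Build the SSML body the same way save_audio() does.
xml_body = ElementTree.Element('speak', version='1.0')
xml_body.set('{http://www.w3.org/XML/1998/namespace}lang', 'en-us')
voice = ElementTree.SubElement(xml_body, 'voice')
voice.set('{http://www.w3.org/XML/1998/namespace}lang', 'en-US')
voice.set('name', 'Microsoft Server Speech Text to Speech Voice (en-US, AriaRUS)')
voice.text = 'Hello, world!'
body = ElementTree.tostring(xml_body, encoding='utf-8')

# Prints the SSML document (including an XML declaration) that would be
# sent to the text-to-speech endpoint.
print(body.decode('utf-8'))
```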
-
-### Add a route to `app.py`
-
-Let's create a route in your Flask app that calls `synthesize.py`. This route will be called each time a user presses the convert text-to-speech button in your app. Like the routes for translation and sentiment analysis, this route is going to accept `POST` requests since the function expects two arguments: the text to synthesize, and the voice font for playback.
-
-1. Open `app.py` and locate the import statement at the top of `app.py` and update it:
-
- ```python
- import translate, sentiment, synthesize
- ```
- Now our Flask app can use the method available via `synthesize.py`.
-
-2. Copy this code to the end of `app.py` and save:
-
- ```Python
- @app.route('/text-to-speech', methods=['POST'])
- def text_to_speech():
- data = request.get_json()
- text_input = data['text']
- voice_font = data['voice']
- tts = synthesize.TextToSpeech(text_input, voice_font)
- tts.get_token()
- audio_response = tts.save_audio()
- return audio_response
- ```
-
-### Update `index.html`
-
-Now that you have a function to convert text-to-speech, and a route in your Flask app to call it, the next step is to start writing the HTML for your app. The HTML below does a few things:
-
-* Provides a voice selection drop-down
-* Adds a button to convert text-to-speech
-* Adds an audio element, which is used to play back the synthesized speech
-
-1. Open `index.html` and locate these code comments:
- ```html
- <!-- Start voice font selection code -->
-
- <!-- End voice font selection code -->
- ```
-
-2. Replace the code comments with this HTML block:
- ```html
- <div class="form-group">
- <label for="select-voice"><strong>Select voice font:</strong></label>
- <select class="form-control" id="select-voice">
- <option value="(ar-SA, Naayf)">Arabic | Male | Naayf</option>
- <option value="(ca-ES, HerenaRUS)">Catalan | Female | HerenaRUS</option>
- <option value="(zh-CN, HuihuiRUS)">Chinese (Mainland) | Female | HuihuiRUS</option>
- <option value="(zh-CN, Kangkang, Apollo)">Chinese (Mainland) | Male | Kangkang, Apollo</option>
- <option value="(zh-HK, Tracy, Apollo)">Chinese (Hong Kong)| Female | Tracy, Apollo</option>
- <option value="(zh-HK, Danny, Apollo)">Chinese (Hong Kong) | Male | Danny, Apollo</option>
-        <option value="(zh-TW, Yating, Apollo)">Chinese (Taiwan) | Female | Yating, Apollo</option>
- <option value="(zh-TW, Zhiwei, Apollo)">Chinese (Taiwan) | Male | Zhiwei, Apollo</option>
- <option value="(hr-HR, Matej)">Croatian | Male | Matej</option>
- <option value="(en-US, AriaRUS)">English (US) | Female | AriaRUS</option>
- <option value="(en-US, Guy24kRUS)">English (US) | Male | Guy24kRUS</option>
- <option value="(en-IE, Sean)">English (IE) | Male | Sean</option>
- <option value="(fr-FR, Julie, Apollo)">French | Female | Julie, Apollo</option>
-        <option value="(fr-FR, HortenseRUS)">French | Female | HortenseRUS</option>
- <option value="(fr-FR, Paul, Apollo)">French | Male | Paul, Apollo</option>
- <option value="(de-DE, Hedda)">German | Female | Hedda</option>
- <option value="(de-DE, HeddaRUS)">German | Female | HeddaRUS</option>
-        <option value="(de-DE, Stefan, Apollo)">German | Male | Stefan, Apollo</option>
- <option value="(el-GR, Stefanos)">Greek | Male | Stefanos</option>
-        <option value="(he-IL, Asaf)">Hebrew (Israel) | Male | Asaf</option>
- <option value="(hi-IN, Kalpana, Apollo)">Hindi | Female | Kalpana, Apollo</option>
- <option value="(hi-IN, Hemant)">Hindi | Male | Hemant</option>
- <option value="(it-IT, LuciaRUS)">Italian | Female | LuciaRUS</option>
- <option value="(it-IT, Cosimo, Apollo)">Italian | Male | Cosimo, Apollo</option>
-        <option value="(ja-JP, Ichiro, Apollo)">Japanese | Male | Ichiro, Apollo</option>
- <option value="(ja-JP, HarukaRUS)">Japanese | Female | HarukaRUS</option>
-        <option value="(ko-KR, HeamiRUS)">Korean | Female | HeamiRUS</option>
- <option value="(pt-BR, HeloisaRUS)">Portuguese (Brazil) | Female | HeloisaRUS</option>
- <option value="(pt-BR, Daniel, Apollo)">Portuguese (Brazil) | Male | Daniel, Apollo</option>
- <option value="(pt-PT, HeliaRUS)">Portuguese (Portugal) | Female | HeliaRUS</option>
- <option value="(ru-RU, Irina, Apollo)">Russian | Female | Irina, Apollo</option>
- <option value="(ru-RU, Pavel, Apollo)">Russian | Male | Pavel, Apollo</option>
- <option value="(ru-RU, EkaterinaRUS)">Russian | Female | EkaterinaRUS</option>
- <option value="(es-ES, Laura, Apollo)">Spanish | Female | Laura, Apollo</option>
- <option value="(es-ES, HelenaRUS)">Spanish | Female | HelenaRUS</option>
- <option value="(es-ES, Pablo, Apollo)">Spanish | Male | Pablo, Apollo</option>
- <option value="(th-TH, Pattara)">Thai | Male | Pattara</option>
- <option value="(tr-TR, SedaRUS)">Turkish | Female | SedaRUS</option>
- <option value="(vi-VN, An)">Vietnamese | Male | An</option>
- </select>
- </div>
- ```
-
-3. Next, locate these code comments:
- ```html
- <!-- Add Speech Synthesis button and audio element -->
-
- <!-- End Speech Synthesis button -->
- ```
-
-4. Replace the code comments with this HTML block:
-
-```html
-<button type="submit" class="btn btn-primary mb-2" id="text-to-speech">Convert text-to-speech</button>
-<div id="audio-playback">
- <audio id="audio" controls>
- <source id="audio-source" type="audio/mpeg" />
- </audio>
-</div>
-```
-
-5. Make sure to save your work.
-
-### Update `main.js`
-
-In the code below, content from the HTML is used to construct a request to your Flask route. Specifically, the translation and the selected voice font are assigned to variables, and then passed along in the request to the `text-to-speech` route.
-
-The code then converts the audio response into a blob, and plays it back using the audio element in the HTML.
-
-1. From your IDE, create a file named `main.js` in the `static/scripts` directory.
-2. Copy this code into `static/scripts/main.js`:
- ```javascript
- // Convert text-to-speech
- $("#text-to-speech").on("click", function(e) {
- e.preventDefault();
- var ttsInput = document.getElementById("translation-result").value;
- var ttsVoice = document.getElementById("select-voice").value;
- var ttsRequest = { 'text': ttsInput, 'voice': ttsVoice }
-
- var xhr = new XMLHttpRequest();
- xhr.open("post", "/text-to-speech", true);
- xhr.setRequestHeader("Content-Type", "application/json");
- xhr.responseType = "blob";
- xhr.onload = function(evt){
- if (xhr.status === 200) {
- audioBlob = new Blob([xhr.response], {type: "audio/mpeg"});
- audioURL = URL.createObjectURL(audioBlob);
- if (audioURL.length > 5){
- var audio = document.getElementById("audio");
- var source = document.getElementById("audio-source");
- source.src = audioURL;
- audio.load();
- audio.play();
- }else{
- console.log("An error occurred getting and playing the audio.")
- }
- }
- }
- xhr.send(JSON.stringify(ttsRequest));
- });
- // Code for automatic language selection goes here.
- ```
-3. You're almost done. The last thing you're going to do is add some code to `main.js` to automatically select a voice font based on the language selected for translation. Add this code block to `main.js`:
- ```javascript
- // Automatic voice font selection based on translation output.
- $('select[id="select-language"]').change(function(e) {
- if ($(this).val() == "ar"){
- document.getElementById("select-voice").value = "(ar-SA, Naayf)";
- }
- if ($(this).val() == "ca"){
- document.getElementById("select-voice").value = "(ca-ES, HerenaRUS)";
- }
- if ($(this).val() == "zh-Hans"){
-      document.getElementById("select-voice").value = "(zh-CN, HuihuiRUS)";
- }
- if ($(this).val() == "zh-Hant"){
- document.getElementById("select-voice").value = "(zh-HK, Tracy, Apollo)";
- }
- if ($(this).val() == "hr"){
- document.getElementById("select-voice").value = "(hr-HR, Matej)";
- }
- if ($(this).val() == "en"){
-      document.getElementById("select-voice").value = "(en-US, AriaRUS)";
- }
- if ($(this).val() == "fr"){
- document.getElementById("select-voice").value = "(fr-FR, HortenseRUS)";
- }
- if ($(this).val() == "de"){
- document.getElementById("select-voice").value = "(de-DE, HeddaRUS)";
- }
- if ($(this).val() == "el"){
- document.getElementById("select-voice").value = "(el-GR, Stefanos)";
- }
- if ($(this).val() == "he"){
- document.getElementById("select-voice").value = "(he-IL, Asaf)";
- }
- if ($(this).val() == "hi"){
- document.getElementById("select-voice").value = "(hi-IN, Kalpana, Apollo)";
- }
- if ($(this).val() == "it"){
- document.getElementById("select-voice").value = "(it-IT, LuciaRUS)";
- }
- if ($(this).val() == "ja"){
- document.getElementById("select-voice").value = "(ja-JP, HarukaRUS)";
- }
- if ($(this).val() == "ko"){
- document.getElementById("select-voice").value = "(ko-KR, HeamiRUS)";
- }
- if ($(this).val() == "pt"){
- document.getElementById("select-voice").value = "(pt-BR, HeloisaRUS)";
- }
- if ($(this).val() == "ru"){
- document.getElementById("select-voice").value = "(ru-RU, EkaterinaRUS)";
- }
- if ($(this).val() == "es"){
- document.getElementById("select-voice").value = "(es-ES, HelenaRUS)";
- }
- if ($(this).val() == "th"){
- document.getElementById("select-voice").value = "(th-TH, Pattara)";
- }
- if ($(this).val() == "tr"){
- document.getElementById("select-voice").value = "(tr-TR, SedaRUS)";
- }
- if ($(this).val() == "vi"){
- document.getElementById("select-voice").value = "(vi-VN, An)";
- }
- });
- ```
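As a design note, the long chain of `if` statements above can be collapsed into a single lookup table, which keeps the mapping in one place as voices are added. Here's a minimal, illustrative sketch of the same idea in Python (the language/voice pairs are taken from the voice list above; this isn't part of the app itself):

```python
# Map translation language codes to a default voice font.
# Pairs come from the voice selection drop-down above; this subset is illustrative.
VOICE_FONTS = {
    "ar": "(ar-SA, Naayf)",
    "fr": "(fr-FR, HortenseRUS)",
    "de": "(de-DE, HeddaRUS)",
    "ja": "(ja-JP, HarukaRUS)",
    "ru": "(ru-RU, EkaterinaRUS)",
    "es": "(es-ES, HelenaRUS)",
}

def default_voice(language_code: str) -> str:
    """Return the default voice font for a language code, falling back to English."""
    return VOICE_FONTS.get(language_code, "(en-US, AriaRUS)")
```

The same pattern works in JavaScript with a plain object keyed by language code.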
-
-### Test your app
-
-Let's test speech synthesis in the app.
-
-```bash
-flask run
-```
-
-Navigate to the provided server address. Type text into the input area, select a language, and press **Translate**. You should get a translation. Next, select a voice, then press the **Convert text-to-speech** button. The translation should be played back as synthesized speech. If it doesn't work, make sure that you've added your subscription key.
-
-> [!TIP]
-> If the changes you've made aren't showing up, or the app doesn't work the way you expect it to, try clearing your cache or opening a private/incognito window.
-
-That's it, you have a working app that performs translations, analyzes sentiment, and synthesizes speech. Press **CTRL + C** to stop the app. Be sure to check out the other [Azure Cognitive Services](../index.yml).
-
-## Get the source code
-
-The source code for this project is available on [GitHub](https://github.com/MicrosoftTranslator/Text-Translation-API-V3-Flask-App-Tutorial).
-
-## Next steps
-
-* [Translator reference](./reference/v3-0-reference.md)
-* [Language service API reference](https://westus2.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1)
-* [Text-to-speech API reference](../speech-service/rest-text-to-speech.md)
cognitive-services Tutorial Wpf Translation Csharp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Translator/tutorial-wpf-translation-csharp.md
-Title: "Tutorial: Create a translation app with WPF, C# - Translator"
-description: In this tutorial, you'll create a WPF app to perform text translation, language detection, and spell checking with a single subscription key.
-Previously updated: 05/26/2020
-# Tutorial: Create a translation app with WPF
-
-In this tutorial, you'll build a [Windows Presentation Foundation (WPF)](/visualstudio/designers/getting-started-with-wpf) app that uses Azure Cognitive Services for text translation, language detection, and spell checking with a single subscription key. Specifically, your app will call APIs from the Translator and [Bing Spell Check](https://azure.microsoft.com/services/cognitive-services/spell-check/).
-
-What is WPF? It's a UI framework that creates desktop client apps. The WPF development platform supports a broad set of app development features, including an app model, resources, controls, graphics, layout, data binding, documents, and security. It's a subset of the .NET Framework, so if you have previously built apps with the .NET Framework using ASP.NET or Windows Forms, the programming experience should be familiar. WPF uses Extensible Application Markup Language (XAML) to provide a declarative model for app programming, which we'll review in the coming sections.
-
-In this tutorial, you'll learn how to:
-
-> [!div class="checklist"]
-> * Create a WPF project in Visual Studio
-> * Add assemblies and NuGet packages to your project
-> * Create your app's UI with XAML
-> * Use the Translator to get languages, translate text, and detect the source language
-> * Use the Bing Spell Check API to validate your input and improve translation accuracy
-> * Run your WPF app
-
-### Cognitive Services used in this tutorial
-
-This list includes the Cognitive Services used in this tutorial. Follow the link to browse the API reference for each feature.
-
-| Service | Feature | Description |
-|||-|
-| Translator | [Get Languages](./reference/v3-0-languages.md) | Retrieve a complete list of supported languages for text translation. |
-| Translator | [Translate](./reference/v3-0-translate.md) | Translate text. |
-| Translator | [Detect](./reference/v3-0-detect.md) | Detect the language of the input text. Includes confidence score for detection. |
-| Bing Spell Check | [Spell Check](/rest/api/cognitiveservices/bing-spell-check-api-v7-reference) | Correct spelling errors to improve translation accuracy. |
-
-## Prerequisites
-
-Before we continue, you'll need the following:
-
-* An Azure Cognitive Services subscription. [Get a Cognitive Services key](../cognitive-services-apis-create-account.md#create-a-new-azure-cognitive-services-resource).
-* A Windows machine
-* [Visual Studio 2019](https://www.visualstudio.com/downloads/) - Community or Enterprise
-
-> [!NOTE]
-> We recommend creating the subscription in the West US region for this tutorial. Otherwise, you'll need to change endpoints and regions in the code as you work through this exercise.
-
-## Create a WPF app in Visual Studio
-
-The first thing we need to do is set up our project in Visual Studio.
-
-1. Open Visual Studio. Select **Create a new project**.
-1. In **Create a new project**, locate and select **WPF App (.NET Framework)**. You can select C# from **Language** to narrow the options.
-1. Select **Next**, and then name your project `MSTranslatorDemo`.
-1. Set the framework version to **.NET Framework 4.7.2** or later, and select **Create**.
- ![Enter the name and framework version in Visual Studio](media/name-wpf-project-visual-studio.png)
-
-Your project has been created. You'll notice that there are two tabs open: `MainWindow.xaml` and `MainWindow.xaml.cs`. Throughout this tutorial, we'll be adding code to these two files. We'll modify `MainWindow.xaml` for the app's user interface. We'll modify `MainWindow.xaml.cs` for our calls to Translator and Bing Spell Check.
- ![Review your environment](media/blank-wpf-project.png)
-
-In the next section, we're going to add assemblies and a NuGet package to our project for additional functionality, like JSON parsing.
-
-## Add references and NuGet packages to your project
-
-Our project requires a handful of .NET Framework assemblies and NewtonSoft.Json, which we'll install using the NuGet package manager.
-
-### Add .NET Framework assemblies
-
-Let's add assemblies to our project to serialize and deserialize objects, and to manage HTTP requests and responses.
-
-1. Locate your project in Visual Studio's Solution Explorer. Right-click your project, then select **Add > Reference**, which opens **Reference Manager**.
-1. The **Assemblies** tab lists all .NET Framework assemblies that are available to reference. Use the search bar in the upper right to search for references.
- ![Add assembly references](media/add-assemblies-2019.png)
-1. Select the following references for your project:
- * [System.Runtime.Serialization](/dotnet/api/system.runtime.serialization)
- * [System.Web](/dotnet/api/system.web)
- * System.Web.Extensions
- * [System.Windows](/dotnet/api/system.windows)
-1. After you've added these references to your project, you can click **OK** to close **Reference Manager**.
-
-> [!NOTE]
-> If you'd like to learn more about assembly references, see [How to: Add or remove reference using the Reference Manager](/visualstudio/ide/how-to-add-or-remove-references-by-using-the-reference-manager).
-
-### Install NewtonSoft.Json
-
-Our app will use NewtonSoft.Json to deserialize JSON objects. Follow these instructions to install the package.
-
-1. Locate your project in Visual Studio's Solution Explorer and right-click on your project. Select **Manage NuGet Packages**.
-1. Locate and select the **Browse** tab.
-1. Enter [NewtonSoft.Json](https://www.nuget.org/packages/Newtonsoft.Json/) into the search bar.
-
- ![Locate and install NewtonSoft.Json](media/nuget-package-manager.png)
-
-1. Select the package and click **Install**.
-1. When the installation is complete, close the tab.
-
-## Create a WPF form using XAML
-
-To use your app, you're going to need a user interface. Using XAML, we'll create a form that allows users to select the input and translation languages, enter text to translate, and view the translation output.
-
-Let's take a look at what we're building.
-
-![WPF XAML user interface](media/translator-text-csharp-xaml.png)
-
-The user interface includes these components:
-
-| Name | Type | Description |
-|||-|
-| `FromLanguageComboBox` | ComboBox | Displays a list of the languages supported by Microsoft Translator for text translation. The user selects the language they are translating from. |
-| `ToLanguageComboBox` | ComboBox | Displays the same list of languages as `FromLanguageComboBox`, but is used to select the language the user is translating to. |
-| `TextToTranslate` | TextBox | Allows the user to enter text to be translated. |
-| `TranslateButton` | Button | Use this button to translate text. |
-| `TranslatedTextLabel` | Label | Displays the translation. |
-| `DetectedLanguageLabel` | Label | Displays the detected language of the text to be translated (`TextToTranslate`). |
-
-> [!NOTE]
-> We're creating this form using XAML source code; however, you can also create the form with the designer in Visual Studio.
-
-Let's add the code to our project.
-
-1. In Visual Studio, select the tab for `MainWindow.xaml`.
-1. Copy this code into your project, and then select **File > Save MainWindow.xaml** to save your changes.
- ```xaml
- <Window x:Class="MSTranslatorDemo.MainWindow"
- xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
- xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
- xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
- xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
- xmlns:local="clr-namespace:MSTranslatorDemo"
- mc:Ignorable="d"
- Title="Microsoft Translator" Height="400" Width="700" BorderThickness="0">
- <Grid>
- <Label x:Name="label" Content="Microsoft Translator" HorizontalAlignment="Left" Margin="39,6,0,0" VerticalAlignment="Top" Height="49" FontSize="26.667"/>
- <TextBox x:Name="TextToTranslate" HorizontalAlignment="Left" Height="23" Margin="42,160,0,0" TextWrapping="Wrap" VerticalAlignment="Top" Width="600" FontSize="14" TabIndex="3"/>
- <Label x:Name="EnterTextLabel" Content="Text to translate:" HorizontalAlignment="Left" Margin="40,129,0,0" VerticalAlignment="Top" FontSize="14"/>
- <Label x:Name="toLabel" Content="Translate to:" HorizontalAlignment="Left" Margin="304,58,0,0" VerticalAlignment="Top" FontSize="14"/>
-
- <Button x:Name="TranslateButton" Content="Translate" HorizontalAlignment="Left" Margin="39,206,0,0" VerticalAlignment="Top" Width="114" Height="31" Click="TranslateButton_Click" FontSize="14" TabIndex="4" IsDefault="True"/>
- <ComboBox x:Name="ToLanguageComboBox"
- HorizontalAlignment="Left"
- Margin="306,88,0,0"
- VerticalAlignment="Top"
- Width="175" FontSize="14" TabIndex="2">
-
- </ComboBox>
- <Label x:Name="fromLabel" Content="Translate from:" HorizontalAlignment="Left" Margin="40,58,0,0" VerticalAlignment="Top" FontSize="14"/>
- <ComboBox x:Name="FromLanguageComboBox"
- HorizontalAlignment="Left"
- Margin="42,88,0,0"
- VerticalAlignment="Top"
- Width="175" FontSize="14" TabIndex="1"/>
- <Label x:Name="TranslatedTextLabel" Content="Translation is displayed here." HorizontalAlignment="Left" Margin="39,255,0,0" VerticalAlignment="Top" Width="620" FontSize="14" Height="85" BorderThickness="0"/>
- <Label x:Name="DetectedLanguageLabel" Content="Autodetected language is displayed here." HorizontalAlignment="Left" Margin="39,288,0,0" VerticalAlignment="Top" Width="620" FontSize="14" Height="84" BorderThickness="0"/>
- </Grid>
- </Window>
- ```
-You should now see a preview of the app's user interface in Visual Studio. It should look similar to the image above.
-
-That's it, your form is ready. Now let's write some code to use Text Translation and Bing Spell Check.
-
-> [!NOTE]
-> Feel free to tweak this form or create your own.
-
-## Create your app
-
-`MainWindow.xaml.cs` contains the code that controls our app. In the next few sections, we're going to add code to populate our drop-down menus, and to call a handful of APIs exposed by Translator and Bing Spell Check.
-
-* When the program starts and `MainWindow` is instantiated, the `Languages` method of the Translator is called to retrieve and populate our language selection drop-downs. This happens once at the beginning of each session.
-* When the **Translate** button is clicked, the user's language selection and text are retrieved, spell check is performed on the input, and the translation and detected language are displayed for the user.
- * The `Translate` method of the Translator is called to translate text from `TextToTranslate`. This call also includes the `to` and `from` languages selected using the drop-down menus.
- * The `Detect` method of the Translator is called to determine the text language of `TextToTranslate`.
- * Bing Spell Check is used to validate `TextToTranslate` and adjust misspellings.
-
-All of our project is encapsulated in the `MainWindow : Window` class. Let's start by adding code to set your subscription key, declare endpoints for Translator and Bing Spell Check, and initialize the app.
-
-1. In Visual Studio, select the tab for `MainWindow.xaml.cs`.
-1. Replace the pre-populated `using` statements with the following.
- ```csharp
- using System;
- using System.Windows;
- using System.Net;
- using System.Net.Http;
- using System.IO;
- using System.Collections.Generic;
- using System.Linq;
- using System.Text;
- using Newtonsoft.Json;
- ```
-1. Locate the `MainWindow : Window` class, and replace it with this code:
- ```csharp
-    public partial class MainWindow : Window
-    {
- // This sample uses the Cognitive Services subscription key for all services. To learn more about
- // authentication options, see: https://docs.microsoft.com/azure/cognitive-services/authentication.
- const string COGNITIVE_SERVICES_KEY = "YOUR_COG_SERVICES_KEY";
- // Endpoints for Translator and Bing Spell Check
- public static readonly string TEXT_TRANSLATION_API_ENDPOINT = "https://api.cognitive.microsofttranslator.com/{0}?api-version=3.0";
- const string BING_SPELL_CHECK_API_ENDPOINT = "https://westus.api.cognitive.microsoft.com/bing/v7.0/spellcheck/";
- // An array of language codes
- private string[] languageCodes;
-
- // Dictionary to map language codes from friendly name (sorted case-insensitively on language name)
- private SortedDictionary<string, string> languageCodesAndTitles =
- new SortedDictionary<string, string>(Comparer<string>.Create((a, b) => string.Compare(a, b, true)));
-
- // Global exception handler to display error message and exit
- private static void HandleExceptions(object sender, UnhandledExceptionEventArgs args)
- {
- Exception e = (Exception)args.ExceptionObject;
- MessageBox.Show("Caught " + e.Message, "Error", MessageBoxButton.OK, MessageBoxImage.Error);
- System.Windows.Application.Current.Shutdown();
- }
- // MainWindow constructor
- public MainWindow()
- {
- // Display a message if unexpected error is encountered
- AppDomain.CurrentDomain.UnhandledException += new UnhandledExceptionEventHandler(HandleExceptions);
-
- if (COGNITIVE_SERVICES_KEY.Length != 32)
- {
- MessageBox.Show("One or more invalid API subscription keys.\n\n" +
- "Put your keys in the *_API_SUBSCRIPTION_KEY variables in MainWindow.xaml.cs.",
- "Invalid Subscription Key(s)", MessageBoxButton.OK, MessageBoxImage.Error);
- System.Windows.Application.Current.Shutdown();
- }
- else
- {
- // Start GUI
- InitializeComponent();
- // Get languages for drop-downs
- GetLanguagesForTranslate();
- // Populate drop-downs with values from GetLanguagesForTranslate
- PopulateLanguageMenus();
- }
- }
- // NOTE:
- // In the following sections, we'll add code below this.
- }
- ```
-1. Add your Cognitive Services subscription key and save.
-
-In this code block, we've declared two member variables that contain information about available languages for translation:
-
-| Variable | Type | Description |
-|-||-|
-|`languageCodes` | Array of strings |Caches the language codes. The Translator service uses short codes, such as `en` for English, to identify languages. |
-|`languageCodesAndTitles` | Sorted dictionary | Maps the "friendly" names in the user interface back to the short codes used in the API. Kept sorted alphabetically without regard for case. |
-
-Then, within the `MainWindow` constructor, we've added error handling with `HandleExceptions`. This error handling ensures that an alert is provided if an exception isn't handled. Then a check is run to confirm that the subscription key provided is 32 characters in length. An error is displayed if the key isn't exactly 32 characters.
-
-If the key is the right length, the `InitializeComponent()` call gets the user interface rolling by locating, loading, and instantiating the XAML description of the main app window.
-
-Last, we've added code to call methods to retrieve languages for translation and to populate the drop-down menus for our app's user interface. Don't worry, we'll get to the code behind these calls soon.
-
-## Get supported languages
-
-We recommend calling the Languages resource exposed by the Translator rather than hardcoding the language list in your app.
-
-In this section, we'll create a `GET` request to the Languages resource, specifying that we want a list of languages available for translation.
-
-> [!NOTE]
-> The Languages resource allows you to filter language support with the following query parameters: transliteration, dictionary, and translation. For more information, see [API reference](./reference/v3-0-languages.md).
-
-Before we go any further, let's take a look at a sample output for a call to the Languages resource:
-
-```json
-{
- "translation": {
- "af": {
- "name": "Afrikaans",
- "nativeName": "Afrikaans",
- "dir": "ltr"
- },
- "ar": {
- "name": "Arabic",
- "nativeName": "العربية",
- "dir": "rtl"
- }
-        // Additional languages are provided in the full JSON output.
-    }
-}
-```
-
-From this output, we can extract the language code and the `name` of a specific language. Our app uses NewtonSoft.Json to deserialize the JSON object ([`JsonConvert.DeserializeObject`](https://www.newtonsoft.com/json/help/html/M_Newtonsoft_Json_JsonConvert_DeserializeObject__1.htm)).
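To make the parsing concrete, here's a minimal sketch of extracting the code-to-name mapping from the sample response above. This is illustrative Python, not the NewtonSoft.Json-based C# we'll write below:

```python
import json

# Abbreviated Languages response, matching the sample JSON above.
sample = """
{
  "translation": {
    "af": {"name": "Afrikaans", "nativeName": "Afrikaans", "dir": "ltr"},
    "ar": {"name": "Arabic", "nativeName": "العربية", "dir": "rtl"}
  }
}
"""

# The "translation" object maps each language code to its metadata.
languages = json.loads(sample)["translation"]

# Map each language code to its English display name.
names_by_code = {code: info["name"] for code, info in languages.items()}
```

The C# version below does the same thing: deserialize the response, take the `translation` object, and collect the codes and friendly names.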
-
-Picking up where we left off in the last section, let's add a method to get supported languages to our app.
-
-1. In Visual Studio, open the tab for `MainWindow.xaml.cs`.
-2. Add this code to your project:
- ```csharp
- // ***** GET TRANSLATABLE LANGUAGE CODES
- private void GetLanguagesForTranslate()
- {
- // Send request to get supported language codes
- string uri = String.Format(TEXT_TRANSLATION_API_ENDPOINT, "languages") + "&scope=translation";
-        WebRequest request = WebRequest.Create(uri);
-        request.Headers.Add("Accept-Language", "en");
-        WebResponse response = null;
-        // Read and parse the JSON response
-        response = request.GetResponse();
- using (var reader = new StreamReader(response.GetResponseStream(), UnicodeEncoding.UTF8))
- {
- var result = JsonConvert.DeserializeObject<Dictionary<string, Dictionary<string, Dictionary<string, string>>>>(reader.ReadToEnd());
- var languages = result["translation"];
-
- languageCodes = languages.Keys.ToArray();
- foreach (var kv in languages)
- {
- languageCodesAndTitles.Add(kv.Value["name"], kv.Key);
- }
- }
- }
- // NOTE:
- // In the following sections, we'll add code below this.
- ```
-
-The `GetLanguagesForTranslate()` method creates an HTTP GET request, and the `scope=translation` query string parameter limits the request to languages supported for translation. The `Accept-Language` header with the value `en` is added so that the supported languages are returned in English.
-
-The JSON response is parsed and converted to a dictionary. Then the language codes are added to the `languageCodes` member variable. The key/value pairs that contain the language codes and the friendly language names are looped through and added to the `languageCodesAndTitles` member variable. The drop-down menus in the form display the friendly names, but the codes are needed to request the translation.
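The request URL itself is built by substituting the resource name into the endpoint template and appending the scope parameter. The same string templating, sketched in Python (the template string is copied from `MainWindow.xaml.cs`):

```python
# Endpoint template as declared in MainWindow.xaml.cs; "{0}" is the resource name.
ENDPOINT = "https://api.cognitive.microsofttranslator.com/{0}?api-version=3.0"

# Substitute the "languages" resource and limit the scope to translation.
uri = ENDPOINT.format("languages") + "&scope=translation"
```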
-
-## Populate language drop-down menus
-
-The user interface is defined using XAML, so you don't need to do much to set it up besides call `InitializeComponent()`. The one thing you need to do is add the friendly language names to the **Translate from** and **Translate to** drop-down menus. The `PopulateLanguageMenus()` method adds the names.
-
-1. In Visual Studio, open the tab for `MainWindow.xaml.cs`.
-2. Add this code to your project below the `GetLanguagesForTranslate()` method:
- ```csharp
- private void PopulateLanguageMenus()
- {
- // Add option to automatically detect the source language
- FromLanguageComboBox.Items.Add("Detect");
-
- int count = languageCodesAndTitles.Count;
- foreach (string menuItem in languageCodesAndTitles.Keys)
- {
- FromLanguageComboBox.Items.Add(menuItem);
- ToLanguageComboBox.Items.Add(menuItem);
- }
-
- // Set default languages
- FromLanguageComboBox.SelectedItem = "Detect";
- ToLanguageComboBox.SelectedItem = "English";
- }
- // NOTE:
- // In the following sections, we'll add code below this.
- ```
-
-This method iterates over the `languageCodesAndTitles` dictionary and adds each key to both menus. After the menus are populated, the default from and to languages are set to **Detect** and **English** respectively.
-
-> [!TIP]
-> Without a default selection for the menus, the user can click **Translate** without first choosing a "to" or "from" language. The defaults eliminate the need to deal with this problem.
-
-At this point, `MainWindow` has been initialized and the user interface created. The code we add from here on won't run until the **Translate** button is clicked.
-
-## Detect language of source text
-
-Now we're going to create a method to detect the language of the source text (the text entered into our text area) using the Translator. The value returned by this request will be used in our translation request later.
-
-1. In Visual Studio, open the tab for `MainWindow.xaml.cs`.
-2. Add this code to your project below the `PopulateLanguageMenus()` method:
- ```csharp
- // ***** DETECT LANGUAGE OF TEXT TO BE TRANSLATED
- private string DetectLanguage(string text)
- {
- string detectUri = string.Format(TEXT_TRANSLATION_API_ENDPOINT ,"detect");
-
- // Create request to Detect languages with Translator
- HttpWebRequest detectLanguageWebRequest = (HttpWebRequest)WebRequest.Create(detectUri);
- detectLanguageWebRequest.Headers.Add("Ocp-Apim-Subscription-Key", COGNITIVE_SERVICES_KEY);
- detectLanguageWebRequest.Headers.Add("Ocp-Apim-Subscription-Region", "westus");
- detectLanguageWebRequest.ContentType = "application/json; charset=utf-8";
- detectLanguageWebRequest.Method = "POST";
-
- // Send request
- var serializer = new System.Web.Script.Serialization.JavaScriptSerializer();
- string jsonText = serializer.Serialize(text);
-
- string body = "[{ \"Text\": " + jsonText + " }]";
- byte[] data = Encoding.UTF8.GetBytes(body);
-
- detectLanguageWebRequest.ContentLength = data.Length;
-
- using (var requestStream = detectLanguageWebRequest.GetRequestStream())
- requestStream.Write(data, 0, data.Length);
-
- HttpWebResponse response = (HttpWebResponse)detectLanguageWebRequest.GetResponse();
-
- // Read and parse JSON response
- var responseStream = response.GetResponseStream();
- var jsonString = new StreamReader(responseStream, Encoding.GetEncoding("utf-8")).ReadToEnd();
- dynamic jsonResponse = serializer.DeserializeObject(jsonString);
-
- // Fish out the detected language code
- var languageInfo = jsonResponse[0];
- if (languageInfo["score"] > (decimal)0.5)
- {
- DetectedLanguageLabel.Content = languageInfo["language"];
- return languageInfo["language"];
- }
- else
- return "Unable to confidently detect input language.";
- }
- // NOTE:
- // In the following sections, we'll add code below this.
- ```
-
-This method creates an HTTP `POST` request to the Detect resource. It takes a single argument, `text`, which is passed along as the body of the request. Later, when we create our translation request, the text entered into our UI will be passed to this method for language detection.
-
-Additionally, this method evaluates the confidence score of the response. If the score is greater than `0.5`, then the detected language is displayed in our user interface.
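To illustrate the confidence check, here's a minimal Python sketch of parsing a Detect response and applying the same `0.5` threshold as the C# above. The response shape (an array with `language` and `score` per input) follows the Detect resource; the sample values are made up:

```python
import json

def detected_language(response_json: str, threshold: float = 0.5):
    """Return the detected language code, or None when confidence is too low."""
    # Detect returns one result object per input text; we sent one.
    result = json.loads(response_json)[0]
    if result["score"] > threshold:
        return result["language"]
    return None
```

For example, a high-confidence English result is accepted, while a low score is rejected rather than risking a wrong `from` language in the translation request.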
-
-## Spell check the source text
-
-Now we're going to create a method to spell check our source text using the Bing Spell Check API. Spell checking ensures that we'll get back accurate translations from the Translator. Any corrections to the source text are passed along in our translation request when the **Translate** button is clicked.
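Before we write the C# method, it's worth seeing its key trick in isolation: corrections are applied from right to left, so that replacing text at a later offset never invalidates the earlier offsets. A minimal Python sketch of that replacement logic (illustrative only; the app's version below is C# and also preserves capitalization):

```python
def apply_corrections(text: str, corrections: dict) -> str:
    """Apply {offset: (old, new)} replacements from right to left,
    so earlier offsets remain valid after each substitution."""
    for offset in sorted(corrections, reverse=True):
        old, new = corrections[offset]
        text = text[:offset] + new + text[offset + len(old):]
    return text
```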
-
-1. In Visual Studio, open the tab for `MainWindow.xaml.cs`.
-2. Add this code to your project below the `DetectLanguage()` method:
-
-```csharp
-// ***** CORRECT SPELLING OF TEXT TO BE TRANSLATED
-private string CorrectSpelling(string text)
-{
- string uri = BING_SPELL_CHECK_API_ENDPOINT + "?mode=spell&mkt=en-US";
-
- // Create a request to Bing Spell Check API
- HttpWebRequest spellCheckWebRequest = (HttpWebRequest)WebRequest.Create(uri);
- spellCheckWebRequest.Headers.Add("Ocp-Apim-Subscription-Key", COGNITIVE_SERVICES_KEY);
- spellCheckWebRequest.Method = "POST";
-    spellCheckWebRequest.ContentType = "application/x-www-form-urlencoded"; // required; the request fails without it
-
- // Create and send the request
- string body = "text=" + System.Web.HttpUtility.UrlEncode(text);
- byte[] data = Encoding.UTF8.GetBytes(body);
- spellCheckWebRequest.ContentLength = data.Length;
- using (var requestStream = spellCheckWebRequest.GetRequestStream())
- requestStream.Write(data, 0, data.Length);
- HttpWebResponse response = (HttpWebResponse)spellCheckWebRequest.GetResponse();
-
- // Read and parse the JSON response; get spelling corrections
- var serializer = new System.Web.Script.Serialization.JavaScriptSerializer();
- var responseStream = response.GetResponseStream();
- var jsonString = new StreamReader(responseStream, Encoding.GetEncoding("utf-8")).ReadToEnd();
- dynamic jsonResponse = serializer.DeserializeObject(jsonString);
- var flaggedTokens = jsonResponse["flaggedTokens"];
-
- // Construct sorted dictionary of corrections in reverse order (right to left)
- // This ensures that changes don't impact later indexes
- var corrections = new SortedDictionary<int, string[]>(Comparer<int>.Create((a, b) => b.CompareTo(a)));
- for (int i = 0; i < flaggedTokens.Length; i++)
- {
- var correction = flaggedTokens[i];
- var suggestion = correction["suggestions"][0]; // Consider only first suggestion
- if (suggestion["score"] > (decimal)0.7) // Take it only if highly confident
- corrections[(int)correction["offset"]] = new string[] // dict key = offset
- { correction["token"], suggestion["suggestion"] }; // dict value = {error, correction}
- }
-
- // Apply spelling corrections, in order, from right to left
- foreach (int i in corrections.Keys)
- {
- var oldtext = corrections[i][0];
- var newtext = corrections[i][1];
-
- // Apply capitalization from original text to correction - all caps or initial caps
- if (text.Substring(i, oldtext.Length).All(char.IsUpper)) newtext = newtext.ToUpper();
- else if (char.IsUpper(text[i])) newtext = newtext[0].ToString().ToUpper() + newtext.Substring(1);
-
- text = text.Substring(0, i) + newtext + text.Substring(i + oldtext.Length);
- }
- return text;
-}
-// NOTE:
-// In the following sections, we'll add code below this.
-```
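The reverse-order trick used above is easy to verify in isolation. This sketch (with made-up tokens, independent of any API call) applies two corrections from right to left and shows that earlier offsets stay valid:

```csharp
using System;
using System.Collections.Generic;

class RightToLeftSketch
{
    static void Main()
    {
        string text = "Helo wrld";

        // offset -> { error, correction }, sorted descending so that applying
        // a correction never shifts the offsets of corrections to its left
        var corrections = new SortedDictionary<int, string[]>(
            Comparer<int>.Create((a, b) => b.CompareTo(a)))
        {
            [0] = new[] { "Helo", "Hello" },
            [5] = new[] { "wrld", "world" },
        };

        foreach (int i in corrections.Keys)
        {
            string oldtext = corrections[i][0];
            string newtext = corrections[i][1];
            text = text.Substring(0, i) + newtext + text.Substring(i + oldtext.Length);
        }

        Console.WriteLine(text); // Hello world
    }
}
```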
-
-## Translate text on click
-
-The last thing that we need to do is create a method that is invoked when the **Translate** button in our user interface is clicked.
-
-1. In Visual Studio, open the tab for `MainWindow.xaml.cs`.
-1. Add this code to your project below the `CorrectSpelling()` method and save:
- ```csharp
- // ***** PERFORM TRANSLATION ON BUTTON CLICK
- private async void TranslateButton_Click(object sender, EventArgs e)
- {
- string textToTranslate = TextToTranslate.Text.Trim();
-
- string fromLanguage = FromLanguageComboBox.SelectedValue.ToString();
- string fromLanguageCode;
-
- // auto-detect source language if requested
- if (fromLanguage == "Detect")
- {
- fromLanguageCode = DetectLanguage(textToTranslate);
- if (!languageCodes.Contains(fromLanguageCode))
- {
- MessageBox.Show("The source language could not be detected automatically " +
- "or is not supported for translation.", "Language detection failed",
- MessageBoxButton.OK, MessageBoxImage.Error);
- return;
- }
- }
- else
- fromLanguageCode = languageCodesAndTitles[fromLanguage];
-
- string toLanguageCode = languageCodesAndTitles[ToLanguageComboBox.SelectedValue.ToString()];
-
- // spell-check the source text if the source language is English
- if (fromLanguageCode == "en")
- {
- if (textToTranslate.StartsWith("-")) // don't spell check in this case
- textToTranslate = textToTranslate.Substring(1);
- else
- {
- textToTranslate = CorrectSpelling(textToTranslate);
- TextToTranslate.Text = textToTranslate; // put corrected text into input field
- }
- }
- // handle null operations: no text or same source/target languages
- if (textToTranslate == "" || fromLanguageCode == toLanguageCode)
- {
- TranslatedTextLabel.Content = textToTranslate;
- return;
- }
-
- // send HTTP request to perform the translation
- string endpoint = string.Format(TEXT_TRANSLATION_API_ENDPOINT, "translate");
- string uri = string.Format(endpoint + "&from={0}&to={1}", fromLanguageCode, toLanguageCode);
-
- System.Object[] body = new System.Object[] { new { Text = textToTranslate } };
- var requestBody = JsonConvert.SerializeObject(body);
-
- using (var client = new HttpClient())
- using (var request = new HttpRequestMessage())
- {
- request.Method = HttpMethod.Post;
- request.RequestUri = new Uri(uri);
- request.Content = new StringContent(requestBody, Encoding.UTF8, "application/json");
- request.Headers.Add("Ocp-Apim-Subscription-Key", COGNITIVE_SERVICES_KEY);
- request.Headers.Add("Ocp-Apim-Subscription-Region", "westus");
- request.Headers.Add("X-ClientTraceId", Guid.NewGuid().ToString());
-
- var response = await client.SendAsync(request);
- var responseBody = await response.Content.ReadAsStringAsync();
-
- var result = JsonConvert.DeserializeObject<List<Dictionary<string, List<Dictionary<string, string>>>>>(responseBody);
- var translation = result[0]["translations"][0]["text"];
-
- // Update the translation field
- TranslatedTextLabel.Content = translation;
- }
- }
- ```
-
-The first step is to get the "from" and "to" languages, and the text the user entered into our form. If the source language is set to **Detect**, `DetectLanguage()` is called to determine the language of the source text. The text might be in a language that the Translator doesn't support; in that case, the app displays a message to inform the user and returns without translating the text.
-
-If the source language is English (whether specified or detected), check the spelling of the text with `CorrectSpelling()` and apply any corrections. The corrected text is added back into the text area so that the user sees that a correction was made.
-
-The code to translate text should look familiar: build the URI, create a request, send it, and parse the response. The JSON array may contain more than one object for translation; however, our app only requires one.
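The nested shape that the `JsonConvert` call deserializes can be seen in a standalone sketch. The response body below is a hand-written illustration (not real API output), and `System.Text.Json` is used only to keep it self-contained:

```csharp
using System;
using System.Text.Json;

class TranslationResponseSketch
{
    static void Main()
    {
        // Illustrative response body: an array of result objects, each with a
        // "translations" array; the app reads the first text of the first result
        string responseBody = "[{\"translations\":[{\"text\":\"Hallo Welt\",\"to\":\"de\"}]}]";

        using JsonDocument doc = JsonDocument.Parse(responseBody);
        string translation = doc.RootElement[0]
            .GetProperty("translations")[0]
            .GetProperty("text")
            .GetString();

        Console.WriteLine(translation); // Hallo Welt
    }
}
```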
-
-After a successful request, `TranslatedTextLabel.Content` is replaced with the `translation`, which updates the user interface to display the translated text.
-
-## Run your WPF app
-
-That's it! You now have a working translation app built using WPF. To run your app, click the **Start** button in Visual Studio.
-
-## Source code
-
-Source code for this project is available on GitHub.
-
-* [Explore source code](https://github.com/MicrosoftTranslator/Text-Translation-API-V3-C-Sharp-Tutorial)
-
-## Next steps
-
-> [!div class="nextstepaction"]
-> [Microsoft Translator reference](./reference/v3-0-reference.md)
cognitive-services Local Categories https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-local-business-search/local-categories.md
# Search categories for the Bing Local Business Search API
-> [!WARNING]
-> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
-> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first.
-> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Local Business Search API enables you to search for local business entities in a variety of categories, with priority given to results close to a user's location. You can refine these searches with the `localCircularView` and `localMapView` [parameters](specify-geographic-search.md).
cognitive-services Local Search Query Response https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-local-business-search/local-search-query-response.md
# Sending and using Bing Local Business Search API queries and responses
You can get local results from the Bing Local Business Search API by sending a search query to its endpoint and including the `Ocp-Apim-Subscription-Key` header, which is required. Along with the available [headers](local-search-reference.md#headers) and [parameters](local-search-reference.md#query-parameters), searches can be customized by specifying the [geographic boundaries](specify-geographic-search.md) of the area to be searched and the [categories](local-search-query-response.md) of places returned.
cognitive-services Local Search Reference https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-local-business-search/local-search-reference.md
# Bing Local Business Search API v7 reference
The Local Business Search API sends a search query to Bing to get results that include restaurants, hotels, or other local businesses. For places, the query can specify the name of the local business or a category (for example, restaurants near me). Entity results include persons, places, or things. A place in this context can be a business entity, a state, a country/region, and so on.
cognitive-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-local-business-search/overview.md
# What is Bing Local Business Search?
The Bing Local Business Search API is a RESTful service that enables your applications to find information about local businesses based on search queries. For example, `q=<business-name> in Redmond, Washington`, or `q=Italian restaurants near me`.

## Features
cognitive-services Local Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-local-business-search/quickstarts/local-quickstart.md
# Quickstart: Send a query to the Bing Local Business Search API in C#
Use this quickstart to learn how to send requests to the Bing Local Business Search API, which is an Azure Cognitive Service. Although this simple application is written in C#, the API is a RESTful Web service compatible with any programming language capable of making HTTP requests and parsing JSON.
cognitive-services Local Search Java Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-local-business-search/quickstarts/local-search-java-quickstart.md
# Quickstart: Send a query to the Bing Local Business Search API using Java
Use this quickstart to learn how to send requests to the Bing Local Business Search API, which is an Azure Cognitive Service. Although this simple application is written in Java, the API is a RESTful Web service compatible with any programming language capable of making HTTP requests and parsing JSON.
cognitive-services Local Search Node Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-local-business-search/quickstarts/local-search-node-quickstart.md
# Quickstart: Send a query to the Bing Local Business Search API using Node.js
Use this quickstart to learn how to send requests to the Bing Local Business Search API, which is an Azure Cognitive Service. Although this simple application is written in Node.js, the API is a RESTful Web service compatible with any programming language capable of making HTTP requests and parsing JSON.
cognitive-services Local Search Python Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-local-business-search/quickstarts/local-search-python-quickstart.md
# Quickstart: Send a query to the Bing Local Business Search API in Python
Use this quickstart to learn how to send requests to the Bing Local Business Search API, which is an Azure Cognitive Service. Although this simple application is written in Python, the API is a RESTful Web service compatible with any programming language capable of making HTTP requests and parsing JSON.
import json
# Replace the subscriptionKey string value with your valid subscription key.
subscriptionKey = 'YOUR-SUBSCRIPTION-KEY'
host = 'api.cognitive.microsoft.com'
cognitive-services Specify Geographic Search https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-local-business-search/specify-geographic-search.md
# Use geographic boundaries to filter results from the Bing Local Business Search API
The Bing Local Business Search API enables you to set boundaries on the specific geographic area you'd like to search by using the `localCircularView` or `localMapView` query parameters. Be sure to use only one of these parameters in your queries.
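As a sketch of how such a query string might be assembled, the snippet below builds a request URI with a circular search area. The endpoint path, coordinates, and radius are illustrative assumptions (not values from this article), and the request is only constructed, not sent:

```csharp
using System;

class LocalCircularViewSketch
{
    static void Main()
    {
        // Hypothetical host/path and coordinates; localCircularView takes
        // latitude,longitude,radius-in-meters -- use only one geographic
        // parameter per query
        string endpoint = "https://api.cognitive.microsoft.com/bing/v7.0/localbusinesses/search";
        string uri = endpoint
            + "?q=" + Uri.EscapeDataString("restaurants")
            + "&localCircularView=47.64,-122.13,5000";
        Console.WriteLine(uri);
    }
}
```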
cognitive-services Bing Insights Usage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/bing-insights-usage.md
Last updated 04/03/2019
# Examples of Bing insights usage
This article contains examples of how Bing might use and display image insights on Bing.com.
cognitive-services Sending Queries https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/concepts/sending-queries.md
# Sending search queries to the Bing Visual Search API
This article describes the parameters and attributes of requests sent to the Bing Visual Search API, as well as the response object.
cognitive-services Default Insights Tag https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/default-insights-tag.md
Last updated 04/04/2019
# Default insights tag
The default insights tag is the one with the `displayName` field set to an empty string. The following example shows the possible list of default insights (actions). The list of actions the response includes depends on the image, and for each action the list of properties may vary by image, so check whether a property exists before trying to use it.
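A minimal sketch of picking out the default tag, run against a hand-written sample body (the tag name is made up, and a real response contains many more fields per tag):

```csharp
using System;
using System.Text.Json;

class DefaultTagSketch
{
    static void Main()
    {
        // Hand-written sample: the default tag is the one whose displayName is empty
        string body = "{\"tags\":[{\"displayName\":\"Paintings\"},{\"displayName\":\"\"}]}";

        using JsonDocument doc = JsonDocument.Parse(body);
        foreach (JsonElement tag in doc.RootElement.GetProperty("tags").EnumerateArray())
        {
            if (tag.GetProperty("displayName").GetString() == "")
                Console.WriteLine("found default insights tag");
        }
    }
}
```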
cognitive-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/language-support.md
Last updated 09/25/2018
# Language and region support for the Bing Visual Search API
Bing Visual Search API supports more than three dozen countries/regions, many with more than one language. Each request should include the user's country/region and language of choice. Knowing the user's market helps Bing return appropriate results. If you don't specify a country/region and language, Bing makes a best effort to determine the user's country/region and language. Because the results may contain links to Bing, knowing the country/region and language may provide a preferred localized Bing user experience if the user clicks the Bing links.
cognitive-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/overview.md
Last updated 12/19/2019
# What is the Bing Visual Search API?
The Bing Visual Search API returns insights for an image. You can either upload an image or provide a URL to one. Insights are visually similar images, shopping sources, webpages that include the image, and more. Insights returned by the Bing Visual Search API are similar to ones shown on Bing.com/images.
cognitive-services Client Libraries https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/quickstarts/client-libraries.md
# Quickstart: Use the Bing Visual Search client library
::: zone pivot="programming-language-csharp"
cognitive-services Csharp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/quickstarts/csharp.md
# Quickstart: Get image insights using the Bing Visual Search REST API and C#
This quickstart demonstrates how to upload an image to the Bing Visual Search API and view the insights that it returns.
cognitive-services Go https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/quickstarts/go.md
# Quickstart: Get image insights using the Bing Visual Search REST API and Go
Use this quickstart to make your first call to the Bing Visual Search API using the Go programming language. A POST request uploads an image to the API endpoint. The results include URLs and descriptive information about images similar to the uploaded image.
cognitive-services Java https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/quickstarts/java.md
# Quickstart: Get image insights using the Bing Visual Search REST API and Java
Use this quickstart to make your first call to the Bing Visual Search API. This Java application uploads an image to the API and displays the information it returns. Although this application is written in Java, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/quickstarts/nodejs.md
# Quickstart: Get image insights using the Bing Visual Search REST API and Node.js
Use this quickstart to make your first call to the Bing Visual Search API. This simple JavaScript application uploads an image to the API, and displays the information returned about it. Although this application is written in JavaScript, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/quickstarts/python.md
# Quickstart: Get image insights using the Bing Visual Search REST API and Python
Use this quickstart to make your first call to the Bing Visual Search API. This Python application uploads an image to the API and displays the information it returns. Although this application is written in Python, the API is a RESTful Web service compatible with most programming languages.
cognitive-services Ruby https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/quickstarts/ruby.md
# Quickstart: Get image insights using the Bing Visual Search REST API and Ruby
Use this quickstart to make your first call to the Bing Visual Search API using the Ruby programming language. A POST request uploads an image to the API endpoint. The results include URLs and descriptive information about images similar to the uploaded image.
cognitive-services Tutorial Bing Visual Search Single Page App https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/tutorial-bing-visual-search-single-page-app.md
# Tutorial: Create a Visual Search single-page web app
The Bing Visual Search API returns insights for an image. You can either upload an image or provide a URL to one. Insights are visually similar images, shopping sources, webpages that include the image, and more. Insights returned by the Bing Visual Search API are similar to ones shown on Bing.com/images.
cognitive-services Tutorial Visual Search Crop Area Results https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/tutorial-visual-search-crop-area-results.md
# Tutorial: Crop an image with the Bing Visual Search SDK for C#
The Bing Visual Search SDK enables you to crop an image before finding similar online images. This application crops a single person from an image containing several people, and then returns search results containing similar images found online.
cognitive-services Tutorial Visual Search Image Upload https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/tutorial-visual-search-image-upload.md
# Tutorial: Upload images to the Bing Visual Search API
The Bing Visual Search API enables you to search the web for images similar to ones you upload. Use this tutorial to create a web application that can send an image to the API, and display the insights it returns within the webpage. Note that this application does not adhere to all [Bing Use and Display Requirements](../bing-web-search/use-display-requirements.md) for using the API.
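For orientation, a minimal Python sketch of how such an upload could be assembled is shown below. The endpoint URL and the `image` multipart form-field name are assumptions based on the quickstarts, not taken from this tutorial, and the actual send is left to the reader.

```python
# Hypothetical sketch: assembling an image-upload request for the Bing Visual
# Search endpoint. ENDPOINT and the "image" form-field name are assumptions.
ENDPOINT = "https://api.bing.microsoft.com/v7.0/images/visualsearch"

def build_upload_request(subscription_key, image_bytes):
    """Return the (url, headers, files) triple a requests.post call would need."""
    headers = {"Ocp-Apim-Subscription-Key": subscription_key}
    # The image binary goes in a multipart form field named "image".
    files = {"image": ("myfile", image_bytes)}
    return ENDPOINT, headers, files
```

With these pieces, `requests.post(url, headers=headers, files=files)` would upload the image and return the insights JSON.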
cognitive-services Tutorial Visual Search Insights Token https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/tutorial-visual-search-insights-token.md
# Tutorial: Find similar images from previous searches using an image insights token
The Visual Search client library enables you to find images online from previous searches that return an `ImageInsightsToken`. This application gets an `ImageInsightsToken` and uses the token in a subsequent search. It then sends the `ImageInsightsToken` to Bing and returns results that include Bing Search URLs and URLs of similar images found online.
cognitive-services Use Insights Token https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/bing-visual-search/use-insights-token.md
# Use an insights token to get insights for an image
The Bing Visual Search API returns information about an image that you provide. You can provide the image by using the URL of the image, an insights token, or by uploading an image. For information about these options, see [What is Bing Visual Search API?](overview.md). This article demonstrates using an insights token. For examples that demonstrate how to upload an image to get insights, see the quickstarts:
To run this application, follow these steps:

```python
# Download and install Python at https://www.python.org/
# Run the following in a command console window:
# pip3 install requests

import requests
import json

SUBSCRIPTION_KEY = 'your-subscription-key'  # replace with your key
# Visual Search endpoint
BASE_URI = 'https://api.bing.microsoft.com/v7.0/images/visualsearch'

HEADERS = {'Ocp-Apim-Subscription-Key': SUBSCRIPTION_KEY}

# To get an insights token, call the /images/search endpoint. Get the token from
# the imageInsightsToken field in the Image object.
insightsToken = 'ccid_tmaGQ2eU*mid_D12339146CFEDF3D409CC7A66D2C98D0D71904D4*simid_608022145667564759*thid_OIP.tmaGQ2eUI1yq3yll!_jn9kwHaFZ'
formData = '{"imageInfo":{"imageInsightsToken":"' + insightsToken + '"}}'

def print_json(obj):
    """Print the object as formatted JSON."""
    print(json.dumps(obj, sort_keys=True, indent=2))

def main():
    # Post the insights token to the endpoint and print the returned insights.
    response = requests.post(BASE_URI, headers=HEADERS, data=formData)
    response.raise_for_status()
    print_json(response.json())

# Main execution
if __name__ == '__main__':
    main()
```
[Create a Visual Search single-page web app](tutorial-bing-visual-search-single-page-app.md)
[What is the Bing Visual Search API?](overview.md)
[Try Cognitive Services](https://aka.ms/bingvisualsearchtryforfree)
-[Images - Visual Search](/rest/api/cognitiveservices/bingvisualsearch/images/visualsearch)
+[Images - Visual Search](/rest/api/cognitiveservices/bingvisualsearch/images/visualsearch)
cognitive-services Train Model https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/language-service/conversational-language-understanding/how-to/train-model.md
The training times can be anywhere from a few seconds when dealing with orchestr
## Train model
-Enter a new model name or select an existing model from the **Model Name** dropdown. Press the enter key after you add a model name. Select whether you want to evaluate your model by changing the **Run evaluation with training** toggle. If enabled, your tagged utterances will be spilt into 3 parts; 80% for training, 10% for validation and 10% for testing. Afterwards, you'll be able to see the model's evaluation results.
+Select **Train model** on the left of the screen. Select **Start a training job** from the top menu.
+
+Enter a new model name or select an existing model from the **Model Name** dropdown.
+
+Select whether you want to evaluate your model by changing the **Run evaluation with training** toggle. If enabled, your tagged utterances will be split into 2 parts; 80% for training, 20% for testing. Afterwards, you'll be able to see the model's evaluation results.
:::image type="content" source="../media/train-model.png" alt-text="A screenshot showing the Train model page for Conversational Language Understanding projects." lightbox="../media/train-model.png":::
-Click the **Train** button and wait for training to complete. You will see the training status of your model in the view model details page.
+Click the **Train** button and wait for training to complete. You will see the training status of your model in the view model details page. Only successfully completed tasks will generate models.
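The 80/20 split that the evaluation toggle applies can be sketched in Python. The ratio comes from the text above; the function name, seed, and shuffling are illustrative, not the service's actual implementation.

```python
import random

def split_utterances(utterances, train_frac=0.8, seed=7):
    """Shuffle tagged utterances, then split them 80% training / 20% testing."""
    items = list(utterances)
    random.Random(seed).shuffle(items)
    cut = int(len(items) * train_frac)
    return items[:cut], items[cut:]
```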
## Evaluate model
cognitive-services Service Limits https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/language-service/custom-classification/service-limits.md
Custom text classification is only available in select Azure regions. When you crea
* You can't rename your project after creation.
+* Your project name must only contain alphanumeric characters (letters and numbers). Spaces and special characters are not allowed. Project names can have a maximum of 50 characters.
+
* You must have a minimum of 10 files in your project and a maximum of 1,000,000 files.
* You can have up to 10 trained models per project.
* Model names have to be unique within the same project.
+* Model name must only contain alphanumeric characters (letters and numbers). Spaces and special characters are not allowed. Model names can have a maximum of 50 characters.
+
* You can't rename your model after creation.
* You can only train one model at a time per project.
Custom text classification is only available in select Azure regions. When you crea
| Attribute | Limits |
|--|--|
-| Project name | You can only use letters `(a-z, A-Z)`, and numbers `(0-9)` with no spaces. |
-| Model name | You can only use letters `(a-z, A-Z)`, numbers `(0-9)` and symbols `@ # _ . , ^ \ [ ]` |
-| entity names| You can only use letters `(a-z, A-Z)`, numbers `(0-9)` and symbols `@ # _ . , ^ \ [ ]` |
-| File names | You can only use letters `(a-z, A-Z)`, and numbers `(0-9)` with no spaces. |
+| Project name | You can only use letters `(a-z, A-Z)`, and numbers `(0-9)` with no spaces. Maximum allowed length is 50 characters. |
+| Model name | You can only use letters `(a-z, A-Z)`, numbers `(0-9)` and symbols `@ # _ . , ^ \ [ ]`. Maximum allowed length is 50 characters. |
+| Class name| You can only use letters `(a-z, A-Z)`, numbers `(0-9)` and symbols `@ # _ . , ^ \ [ ]`. Maximum allowed length is 50 characters. |
+| File name | You can only use letters `(a-z, A-Z)`, and numbers `(0-9)` with no spaces. |
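As an illustration only, the naming limits in the table above can be expressed as regular expressions. These patterns are derived from the table, not from the service itself, and the function names are hypothetical.

```python
import re

# Patterns derived from the limits table; illustrative, not authoritative.
PROJECT_NAME = re.compile(r"^[A-Za-z0-9]{1,50}$")               # letters and digits, no spaces, <= 50 chars
MODEL_OR_CLASS_NAME = re.compile(r"^[A-Za-z0-9@#_.,^\\\[\]]{1,50}$")  # also allows @ # _ . , ^ \ [ ]

def is_valid_project_name(name):
    return PROJECT_NAME.fullmatch(name) is not None

def is_valid_model_name(name):
    return MODEL_OR_CLASS_NAME.fullmatch(name) is not None
```

Validating names client-side like this catches errors before a project or model creation call is made.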
cognitive-services Service Limits https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/language-service/custom-named-entity-recognition/service-limits.md
Use this article to learn about the data and service limits when using Custom NE
* All files should be available at the root of your container.
-* Maximum allowed length for your file sis 128,000 characters, which is approximately 28,000 words or 56 pages.
+* Maximum allowed length for your file is 128,000 characters, which is approximately 28,000 words or 56 pages.
* Your [training dataset](how-to/train-model.md#data-split) should include at least 10 files and not more than 100,000 files.
-
## API limits

* When using the Authoring API, there is a maximum of 10 POST requests and 100 GET requests per minute.
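To stay under per-minute limits like these, a client can pace its calls. This is a generic pacing sketch, not part of the Authoring API; the function name is an assumption.

```python
def min_interval_seconds(requests_per_minute):
    """Smallest delay between successive calls that keeps a client under a
    per-minute request limit (e.g. 10 POSTs/minute -> one POST every 6 s)."""
    return 60.0 / requests_per_minute
```

Sleeping `min_interval_seconds(10)` between Authoring API POSTs keeps a single-threaded client under the documented limit.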
Custom text classification is only available in select Azure regions. When you crea
## Project limits
-* You can only connect 1 storage container for each project. This process is irreversible. If you connect a container to your project, you cannot disconnect it later.
+* You can only connect 1 storage account for each project. This process is irreversible. If you connect a storage account to your project, you cannot disconnect it later.
* You can only have 1 [tags file](how-to/tag-data.md) per project. You cannot change to a different tags file later. You can only update the tags within your project.
* You cannot rename your project after creation.
+* Your project name must only contain alphanumeric characters (letters and numbers). Spaces and special characters are not allowed. Project names can have a maximum of 50 characters.
+
* You must have a minimum of 10 tagged files in your project and a maximum of 100,000 files.
-* You can have up to 10 trained models per project.
+* You can have up to 50 trained models per project.
* Model names have to be unique within the same project.
+* Model names must only contain alphanumeric characters (letters and numbers). Spaces and special characters are not allowed. Model names can have a maximum of 50 characters.
+
* You cannot rename your model after creation.
* You can only train one model at a time per project.
Custom text classification is only available in select Azure regions. When you crea
* It is recommended to have around 200 tagged instances per entity, and you must have a minimum of 10 tagged instances per entity.
+* Entity names must have a maximum of 50 characters.
+
## Naming limits

| Attribute | Limits |
|--|--|
-| Project name | You can only use letters `(a-z, A-Z)`, and numbers `(0-9)` with no spaces. |
-| Model name | You can only use letters `(a-z, A-Z)`, and numbers `(0-9)` with no spaces. |
-| Entity names| You can only use letters `(a-z, A-Z)`, and numbers `(0-9)` and symbols `@ # _ . , ^ \ [ ]` |
-| File names | You can only use letters `(a-z, A-Z)`, and numbers `(0-9)` with no spaces. |
+| Project name | You can only use letters `(a-z, A-Z)`, and numbers `(0-9)` with no spaces. Maximum length allowed is 50 characters. |
+| Model name | You can only use letters `(a-z, A-Z)`, and numbers `(0-9)` with no spaces. Maximum length allowed is 50 characters. |
+| Entity name | You can only use letters `(a-z, A-Z)`, and numbers `(0-9)` and symbols `@ # _ . , ^ \ [ ]`. Maximum length allowed is 50 characters. |
+| File name | You can only use letters `(a-z, A-Z)`, and numbers `(0-9)` with no spaces. |
## Next steps
cognitive-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/language-service/overview.md
Previously updated : 11/02/2021 Last updated : 02/01/2022
Azure Cognitive Service for Language provides the following features:
> [!div class="mx-tdCol2BreakAll"]
> |Feature |Description | Deployment options|
> ||||
-> | [Named Entity Recognition (NER)](named-entity-recognition/overview.md) | This pre-configured feature identifies entities in text across several pre-defined categories. | • [Language Studio](language-studio.md) <br> • [REST API and client-library](named-entity-recognition/quickstart.md) |
-> | [Personally Identifiable Information (PII) detection](personally-identifiable-information/overview.md) | This pre-configured feature identifies entities in text across several pre-defined categories of sensitive information, such as account information. | • [Language Studio](language-studio.md) <br> • [REST API and client-library](named-entity-recognition/quickstart.md) |
-> | [Key phrase extraction](key-phrase-extraction/overview.md) | This pre-configured feature evaluates unstructured text, and for each input document, returns a list of key phrases and main points in the text. | • [Language Studio](language-studio.md) <br> • [REST API and client-library](key-phrase-extraction/quickstart.md) <br> • [Docker container](key-phrase-extraction/how-to/use-containers.md) |
-> |[Entity linking](entity-linking/overview.md) | This pre-configured feature disambiguates the identity of an entity found in text and provides links to the entity on Wikipedia. | • [Language Studio](language-studio.md) <br> • [REST API and client-library](entity-linking/quickstart.md) |
-> | [Text Analytics for health](text-analytics-for-health/overview.md) | This pre-configured feature extracts information from unstructured medical texts, such as clinical notes and doctor's notes. | • [Language Studio](language-studio.md) <br> • [REST API and client-library](text-analytics-for-health/quickstart.md) <br> • [Docker container](text-analytics-for-health/how-to/use-containers.md) |
-> | [Custom NER](custom-named-entity-recognition/overview.md) | Build an AI model to extract custom entity categories, using unstructured text that you provide. | • [Language Studio](custom-named-entity-recognition/quickstart.md?pivots=language-studio) <br> • [REST API](custom-named-entity-recognition/quickstart.md?pivots=rest-api) |
-> | [Analyze sentiment and opinions](sentiment-opinion-mining/overview.md) | This pre-configured feature provides sentiment labels (such as "*negative*", "*neutral*" and "*positive*") for sentences and documents. This feature can additionally provide granular information about the opinions related to words that appear in the text, such as the attributes of products or services. | • [Language Studio](language-studio.md) <br> • [REST API and client-library](sentiment-opinion-mining/quickstart.md) <br> • [Docker container](sentiment-opinion-mining/how-to/use-containers.md)
-> |[Language detection](language-detection/overview.md) | This pre-configured feature evaluates text, and determines the language it was written in. It returns a language identifier and a score that indicates the strength of the analysis. | • [Language Studio](language-studio.md) <br> • [REST API and client-library](language-detection/quickstart.md) <br> • [Docker container](language-detection/how-to/use-containers.md) |
-> |[Custom text classification (preview)](custom-classification/overview.md) | Build an AI model to classify unstructured text into custom classes that you define. | • [Language Studio](custom-classification/quickstart.md?pivots=language-studio)<br> • [REST API](language-detection/quickstart.md?pivots=rest-api) |
-> | [Text Summarization (preview)](text-summarization/overview.md) | This pre-configured feature extracts key sentences that collectively convey the essence of a document. | • [Language Studio](language-studio.md) <br> • [REST API and client-library](text-summarization/quickstart.md) |
-> | [Conversational language understanding (preview)](conversational-language-understanding/overview.md) | Build an AI model to bring the ability to understand natural language into apps, bots, and IoT devices. | • [Language Studio](conversational-language-understanding/quickstart.md)
-> | [Question answering](question-answering/overview.md) | This pre-configured feature provides answers to questions extracted from text input, using semi-structured content such as: FAQs, manuals, and documents. | • [Language Studio](language-studio.md) <br> • [REST API and client-library](question-answering/quickstart/sdk.md) |
+> | [Named Entity Recognition (NER)](named-entity-recognition/overview.md) | This pre-configured feature identifies entities in text across several pre-defined categories. | * [Language Studio](language-studio.md) <br> * [REST API and client-library](named-entity-recognition/quickstart.md) |
+> | [Personally Identifiable Information (PII) detection](personally-identifiable-information/overview.md) | This pre-configured feature identifies entities in text across several pre-defined categories of sensitive information, such as account information. | * [Language Studio](language-studio.md) <br> * [REST API and client-library](personally-identifiable-information/quickstart.md) |
+> | [Key phrase extraction](key-phrase-extraction/overview.md) | This pre-configured feature evaluates unstructured text, and for each input document, returns a list of key phrases and main points in the text. | * [Language Studio](language-studio.md) <br> * [REST API and client-library](key-phrase-extraction/quickstart.md) <br> * [Docker container](key-phrase-extraction/how-to/use-containers.md) |
+> |[Entity linking](entity-linking/overview.md) | This pre-configured feature disambiguates the identity of an entity found in text and provides links to the entity on Wikipedia. | * [Language Studio](language-studio.md) <br> * [REST API and client-library](entity-linking/quickstart.md) |
+> | [Text Analytics for health](text-analytics-for-health/overview.md) | This pre-configured feature extracts information from unstructured medical texts, such as clinical notes and doctor's notes. | * [Language Studio](language-studio.md) <br> * [REST API and client-library](text-analytics-for-health/quickstart.md) <br> * [Docker container](text-analytics-for-health/how-to/use-containers.md) |
+> | [Custom NER](custom-named-entity-recognition/overview.md) | Build an AI model to extract custom entity categories, using unstructured text that you provide. | * [Language Studio](custom-named-entity-recognition/quickstart.md?pivots=language-studio) <br> * [REST API](custom-named-entity-recognition/quickstart.md?pivots=rest-api) |
+> | [Analyze sentiment and opinions](sentiment-opinion-mining/overview.md) | This pre-configured feature provides sentiment labels (such as "*negative*", "*neutral*" and "*positive*") for sentences and documents. This feature can additionally provide granular information about the opinions related to words that appear in the text, such as the attributes of products or services. | * [Language Studio](language-studio.md) <br> * [REST API and client-library](sentiment-opinion-mining/quickstart.md) <br> * [Docker container](sentiment-opinion-mining/how-to/use-containers.md)
+> |[Language detection](language-detection/overview.md) | This pre-configured feature evaluates text, and determines the language it was written in. It returns a language identifier and a score that indicates the strength of the analysis. | * [Language Studio](language-studio.md) <br> * [REST API and client-library](language-detection/quickstart.md) <br> * [Docker container](language-detection/how-to/use-containers.md) |
+> |[Custom text classification (preview)](custom-classification/overview.md) | Build an AI model to classify unstructured text into custom classes that you define. | * [Language Studio](custom-classification/quickstart.md?pivots=language-studio)<br> * [REST API](language-detection/quickstart.md?pivots=rest-api) |
+> | [Text Summarization (preview)](text-summarization/overview.md) | This pre-configured feature extracts key sentences that collectively convey the essence of a document. | * [Language Studio](language-studio.md) <br> * [REST API and client-library](text-summarization/quickstart.md) |
+> | [Conversational language understanding (preview)](conversational-language-understanding/overview.md) | Build an AI model to bring the ability to understand natural language into apps, bots, and IoT devices. | * [Language Studio](conversational-language-understanding/quickstart.md)
+> | [Question answering](question-answering/overview.md) | This pre-configured feature provides answers to questions extracted from text input, using semi-structured content such as: FAQs, manuals, and documents. | * [Language Studio](language-studio.md) <br> * [REST API and client-library](question-answering/quickstart/sdk.md) |
## Tutorials
After you've had a chance to get started with the Language service, try our tuto
* [Extract key phrases from text stored in Power BI](key-phrase-extraction/tutorials/integrate-power-bi.md)
* [Use Power Automate to sort information in Microsoft Excel](named-entity-recognition/tutorials/extract-excel-information.md)
-* [Use Flask to translate text, analyze sentiment, and synthesize speech](../translator/tutorial-build-flask-app-translation-synthesis.md?context=%2fazure%2fcognitive-services%2flanguage-service%2fcontext%2fcontext)
+* [Use Flask to translate text, analyze sentiment, and synthesize speech](/learn/modules/python-flask-build-ai-web-app/)
* [Use Cognitive Services in canvas apps](/powerapps/maker/canvas-apps/cognitive-services-api?context=/azure/cognitive-services/language-service/context/context)
* [Create a FAQ Bot](question-answering/tutorials/bot-service.md)
An AI system includes not only the technology, but also the people who will use
* [Transparency note for the Language service](/legal/cognitive-services/text-analytics/transparency-note)
* [Integration and responsible use](/legal/cognitive-services/text-analytics/guidance-integration-responsible-use)
-* [Data, privacy, and security](/legal/cognitive-services/text-analytics/data-privacy)
+* [Data, privacy, and security](/legal/cognitive-services/text-analytics/data-privacy)
cognitive-services Best Practices https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/language-service/question-answering/concepts/best-practices.md
Question answering takes casing into account but it's intelligent enough to unde
### How are question answer pairs prioritized for multi-turn questions?
-When a knowledge base has hierarchical relationships (either added manually or via extraction) and the previous response was an answer related to other question answer pairs, for the next query we give slight preference to all the children question answer pairs, sibling question answer pairs, and grandchildren question answer pairs in that order. Along with any query, the [Question Answering REST API](https://docs.microsoft.com/rest/api/cognitiveservices/questionanswering/question-answering/get-answers) expects a `context` object with the property `previousQnAId`, which denotes the last top answer. Based on this previous `QnAID`, all the related `QnAs` are boosted.
+When a knowledge base has hierarchical relationships (either added manually or via extraction) and the previous response was an answer related to other question answer pairs, for the next query we give slight preference to all the children question answer pairs, sibling question answer pairs, and grandchildren question answer pairs in that order. Along with any query, the [Question Answering REST API](/rest/api/cognitiveservices/questionanswering/question-answering/get-answers) expects a `context` object with the property `previousQnAId`, which denotes the last top answer. Based on this previous `QnAID`, all the related `QnAs` are boosted.
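For illustration, a request body carrying that multi-turn context could be built as below. The `previousQnAId` property name follows the text above, while the surrounding field names and the function itself are assumptions rather than the API's exact schema.

```python
def build_answers_payload(question, previous_qna_id):
    """Sketch of a get-answers request body that boosts related QnAs by
    passing the previous top answer's ID in the context object."""
    return {
        "question": question,
        # previousQnAId denotes the last top answer, as described above.
        "context": {"previousQnAId": previous_qna_id},
    }
```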
### How are accents treated?
cognitive-services What Are Cognitive Services https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/what-are-cognitive-services.md
Title: What are Azure Cognitive Services?
-description: Cognitive Services makes AI accessible to every developer without requiring machine-learning and data-science expertise. You just need to make an API call from your application to add the ability to see (advanced image search and recognition), hear, speak, search, and decision-making into your apps.
+description: Cognitive Services makes AI accessible to every developer without requiring machine-learning and data-science expertise. You need to make an API call from your application to add the ability to see (advanced image search and recognition), hear, speak, search, and decision-making into your apps.
keywords: cognitive services, cognitive intelligence, cognitive solutions, ai services, cognitive understanding, cognitive features

Previously updated : 01/05/2022 Last updated : 01/31/2022

# What are Azure Cognitive Services?
-Azure Cognitive Services are cloud-based services with REST APIs and client library SDKs available to help you build cognitive intelligence into your applications. You can add cognitive features to your applications without having artificial intelligence (AI) or data science skills. Azure Cognitive Services comprise various AI services that enable you to build cognitive solutions that can see, hear, speak, understand, and even make decisions.
+Azure Cognitive Services are cloud-based services with REST APIs, client library SDKs, and user interfaces available to help you build cognitive intelligence into your applications. You can add cognitive features to your applications without having artificial intelligence (AI) or data science skills. Cognitive Services comprises various AI services that enable you to build cognitive solutions that can see, hear, speak, understand, and even make decisions.
## Categories of Cognitive Services
-The catalog of cognitive services that provide cognitive understanding is categorized into four main pillars:
+Cognitive Services can be categorized into four main pillars:
* Vision
* Speech
* Language
* Decision
-The following sections in this article provide a list of services that are part of these four pillars.
-
## Vision APIs

|Service Name|Service Description|
|:--|:|
|[Computer Vision](./computer-vision/index.yml "Computer Vision")|The Computer Vision service provides you with access to advanced cognitive algorithms for processing images and returning information. See [Computer Vision quickstart](./computer-vision/quickstarts-sdk/client-library.md) to get started with the service.|
-|[Custom Vision Service](./custom-vision-service/index.yml "Custom Vision Service")|The Custom Vision Service lets you build, deploy, and improve your own image classifiers. An image classifier is an AI service that applies labels to images, based on their visual characteristics. |
+|[Custom Vision](./custom-vision-service/index.yml "Custom Vision Service")|The Custom Vision Service lets you build, deploy, and improve your own image classifiers. An image classifier is an AI service that applies labels to images, based on their visual characteristics. |
|[Face](./face/index.yml "Face")| The Face service provides access to advanced face algorithms, enabling face attribute detection and recognition. See [Face quickstart](./face/quickstarts/client-libraries.md) to get started with the service.|

## Speech APIs
The following sections in this article provide a list of services that are part of these four pillars.
|[Bing Speech](./speech-service/how-to-migrate-from-bing-speech.md "Bing Speech") (Retiring)|The Bing Speech API provides you with an easy way to create speech-enabled features in your applications.| |[Translator Speech](/azure/cognitive-services/translator-speech/ "Translator Speech") (Retiring)|Translator Speech is a machine translation service.| -->+ ## Language APIs |Service Name|Service Description| |:--|:|
-|[Azure Cognitive Service for language](./language-service/index.yml "Language service")| Azure Cognitive Service for Language provides several Natural Language Processing (NLP) features for understanding and analyzing text.|
-|[Language Understanding LUIS](./luis/index.yml "Language Understanding")|Language Understanding (LUIS) is a cloud-based conversational AI service that applies custom machine-learning intelligence to a user's conversational, natural language text to predict overall meaning, and pull out relevant, detailed information. [See LUIS quickstart](./luis/luis-get-started-create-app.md) to get started with the service.|
+|[Azure Cognitive Service for language](./language-service/index.yml "Language service")| Azure Cognitive Service for Language provides several Natural Language Processing (NLP) features to understand and analyze text.|
+|[Translator](./translator/index.yml "Translator")|Translator provides machine-based text translation in near real time.|
+|[Language Understanding LUIS](./luis/index.yml "Language Understanding")|Language Understanding (LUIS) is a cloud-based conversational AI service that applies custom machine-learning intelligence to a user's conversational or natural language text to predict overall meaning and pull out relevant information. [See LUIS quickstart](./luis/luis-get-started-create-app.md) to get started with the service.|
|[QnA Maker](./qnamaker/index.yml "QnA Maker")|QnA Maker allows you to build a question and answer service from your semi-structured content. [See QnA Maker quickstart](./qnamaker/quickstarts/create-publish-knowledge-base.md) to get started with the service.|
-|[Translator](./translator/index.yml "Translator")|Translator provides machine-based text translation in near real-time.|
## Decision APIs
Start by creating a Cognitive Services resource with hands-on quickstarts using
* [Azure portal](cognitive-services-apis-create-account.md?tabs=multiservice%2Cwindows "Azure portal")
* [Azure CLI](cognitive-services-apis-create-account-cli.md?tabs=windows "Azure CLI")
* [Azure SDK client libraries](cognitive-services-apis-create-account-cli.md?tabs=windows "cognitive-services-apis-create-account-client-library?pivots=programming-language-csharp")
-* [Azure Resource Manager (ARM) templates](./create-account-resource-manager-template.md?tabs=portal "Azure Resource Manager (ARM) templates")
+* [Azure Resource Manager (ARM template)](./create-account-resource-manager-template.md?tabs=portal "Azure Resource Manager (ARM template)")
## Using Cognitive Services in different development environments
All APIs have a free tier, which has usage and throughput limits. You can incre
## Using Cognitive Services securely
-Azure Cognitive Services provides a layered security model, including [authentication](authentication.md "authentication") via Azure Active Directory credentials, a valid resource key, and [Azure Virtual Networks](cognitive-services-virtual-networks.md "Azure Virtual Networks").
+Azure Cognitive Services provides a layered security model, including [authentication](authentication.md "Authentication") via Azure Active Directory credentials, a valid resource key, and [Azure Virtual Networks](cognitive-services-virtual-networks.md "Azure Virtual Networks").
## Containers for Cognitive Services
- Azure Cognitive Services provides several Docker containers that let you use the same APIs that are available in Azure, on-premises. Using these containers gives you the flexibility to bring Cognitive Services closer to your data for compliance, security or other operational reasons. Learn more about [Cognitive Services Containers](cognitive-services-container-support.md "Cognitive Services Containers").
+ Azure Cognitive Services provides several Docker containers that let you use the same APIs that are available in Azure, on-premises. Using these containers gives you the flexibility to bring Cognitive Services closer to your data for compliance, security, or other operational reasons. For more information, see [Cognitive Services Containers](cognitive-services-container-support.md "Cognitive Services Containers").
## Regional availability
Looking for a region we don't support yet? Let us know by filing a feature request.
## Supported cultural languages
-Cognitive Services supports a wide range of cultural languages at the service level. You can find the language availability for each API in the [supported languages list](language-support.md "supported languages list").
+Cognitive Services supports a wide range of cultural languages at the service level. You can find the language availability for each API in the [supported languages list](language-support.md "Supported languages list").
## Certifications and compliance
-Cognitive Services has been awarded certifications such as CSA STAR Certification, FedRAMP Moderate, and HIPAA BAA. You can [download](https://gallery.technet.microsoft.com/Overview-of-Azure-c1be3942 "download") certifications for your own audits and security reviews.
+Cognitive Services has been awarded certifications such as CSA STAR Certification, FedRAMP Moderate, and HIPAA BAA. You can [download](https://gallery.technet.microsoft.com/Overview-of-Azure-c1be3942 "Download") certifications for your own audits and security reviews.
To understand privacy and data management, go to the [Trust Center](https://servicetrust.microsoft.com/ "Trust Center").
communication-services Custom Teams Endpoint Authentication Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/interop/custom-teams-endpoint-authentication-overview.md
+
+ Title: Authentication of custom Teams endpoint
+
+description: This article discusses authentication of a custom Teams endpoint.
+++++ Last updated : 06/30/2021+++++
+# Authentication flow cases
+
+Azure Communication Services gives developers the ability to build a custom Teams calling experience with the Communication Services calling software development kit (SDK). This article provides insight into the authentication process and describes the individual authentication artifacts. The following use cases demonstrate authentication for single-tenant and multi-tenant Azure Active Directory (Azure AD) applications.
+
+## Case 1: Single-tenant application
+The following scenario shows an example of the company Fabrikam, which has built a custom Teams calling application for internal use. All Teams users are managed by Azure Active Directory. Access to Azure Communication Services is controlled via Azure role-based access control (Azure RBAC).
++
+![Diagram of the process for authenticating Teams user for accessing Fabrikam client application and Fabrikam Azure Communication Services resource.](./media/custom-teams-endpoint/authentication-case-single-tenant-azure-rbac-overview.svg)
+
+The following sequence diagram shows the detailed steps of the authentication:
++
+Prerequisites:
+- Alice or her Azure AD administrator needs to consent to Fabrikam's Azure Active Directory application before the first sign-in. To learn more, see the [consent flow](https://docs.microsoft.com/azure/active-directory/develop/consent-framework).
+- The admin of the Azure Communication Services resource must grant Alice permission to perform this action. To learn more, see [Azure RBAC role assignment](https://docs.microsoft.com/azure/role-based-access-control/role-assignments-portal).
+
+Steps:
+1. Authentication of Alice from Fabrikam against Fabrikam's Azure Active Directory: This step is a standard OAuth flow that uses the Microsoft Authentication Library (MSAL) to authenticate against Fabrikam's Azure Active Directory. Alice authenticates for Fabrikam's Azure AD application. If the authentication succeeds, Fabrikam's client application receives Azure AD access token 'A'. Details of the token are captured below. The developer experience is captured in the [quickstart](../../quickstarts/manage-teams-identity.md).
+1. Get an access token for Alice: This flow is initiated from Fabrikam's client application and performs control plane logic, authorized by artifact 'A', to retrieve Fabrikam's Azure Communication Services access token 'D' for Alice. Details of the token are captured below. This access token can be used for data plane actions in Azure Communication Services, such as calling. The developer experience is captured in the [quickstart](../../quickstarts/manage-teams-identity.md).
+1. Start a call to Bob from Fabrikam: Alice uses the Azure Communication Services access token to make a call to Teams user Bob via the Communication Services calling SDK. You can learn more about the [developer experience in the quickstart](../../quickstarts/voice-video-calling/get-started-with-voice-video-calling-custom-teams-client.md).
+
+Artifacts:
+- Artifact A
+ - Type: Azure AD access token
+ - Audience: _`Azure Communication Services`_ - control plane
+ - Azure AD application ID: Fabrikam's _`Azure AD application ID`_
+ - Permission: _`https://auth.msft.communication.azure.com/Teams.ManageCalls`_
+- Artifact D
+ - Type: Azure Communication Services access token
+ - Audience: _`Azure Communication Services`_ - data plane
+ - Azure Communication Services Resource ID: Fabrikam's _`Azure Communication Services Resource ID`_
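
During development, the claims of a token such as artifact 'A' can be inspected by decoding its payload. The sketch below is illustrative only: the helper name and the fabricated token are assumptions, and decoding is not validation (a real service must verify the token's signature, issuer, and expiry).

```js
// Hypothetical helper for development-time inspection of an Azure AD access
// token such as artifact 'A'. Decoding only reads the payload; it does NOT
// validate the token (signature, issuer, and expiry checks are required in
// any real service).
function decodeJwtPayload(jwt) {
  const payloadPart = jwt.split('.')[1]; // JWT format: header.payload.signature
  return JSON.parse(Buffer.from(payloadPart, 'base64url').toString('utf8'));
}

// Fabricated, unsigned token for demonstration only:
const fakePayload = {
  aud: 'https://auth.msft.communication.azure.com',
  scp: 'Teams.ManageCalls',
};
const fakeJwt = [
  'e30', // base64url of '{}' as a stand-in header
  Buffer.from(JSON.stringify(fakePayload)).toString('base64url'),
  'signature',
].join('.');

const claims = decodeJwtPayload(fakeJwt);
console.log(claims.scp); // 'Teams.ManageCalls'
```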
+
+## Case 2: Multi-tenant application
+The following scenario shows an example of the company Contoso, which has built a custom Teams calling application for external customers, such as the company Fabrikam. The Contoso infrastructure uses custom authentication, and it uses a connection string to retrieve the token for Fabrikam's Teams user.
+
+![Diagram of the process for authenticating Fabrikam Teams user for accessing Contoso client application and Contoso Azure Communication Services resource.](./media/custom-teams-endpoint/authentication-case-multiple-tenants-hmac-overview.svg)
+
+The following sequence diagram shows the detailed steps of the authentication:
++
+Prerequisites:
+- Alice or her Azure AD administrator needs to consent to Contoso's Azure Active Directory application before the first sign-in. To learn more, see the [consent flow](https://docs.microsoft.com/azure/active-directory/develop/consent-framework).
+
+Steps:
+1. Authentication of Alice from Fabrikam against Fabrikam's Azure Active Directory: This step is a standard OAuth flow that uses the Microsoft Authentication Library (MSAL) to authenticate against Fabrikam's Azure Active Directory. Alice authenticates for Contoso's Azure AD application. If the authentication succeeds, Contoso's client application receives Azure AD access token 'A'. Details of the token are captured below. The developer experience is captured in the [quickstart](../../quickstarts/manage-teams-identity.md).
+1. Get an access token for Alice: This flow is initiated from Contoso's client application and performs control plane logic, authorized by artifact 'A', to retrieve Contoso's Azure Communication Services access token 'D' for Alice. Details of the token are captured below. This access token can be used for data plane actions in Azure Communication Services, such as calling. The developer experience is captured in the [quickstart](../../quickstarts/manage-teams-identity.md).
+1. Start a call to Bob from Fabrikam: Alice uses the Azure Communication Services access token to make a call to Teams user Bob via the Communication Services calling SDK. You can learn more about the [developer experience in the quickstart](../../quickstarts/voice-video-calling/get-started-with-voice-video-calling-custom-teams-client.md).
++
+Artifacts:
+- Artifact A
+ - Type: Azure AD access token
+ - Audience: Azure Communication Services - control plane
+ - Azure AD application ID: Contoso's _`Azure AD application ID`_
+ - Permission: _`https://auth.msft.communication.azure.com/Teams.ManageCalls`_
+- Artifact B
+ - Type: Custom Contoso authentication artifact
+- Artifact C
+ - Type: Hash-based Message Authentication Code (HMAC) (based on Contoso's _`connection string`_)
+- Artifact D
+ - Type: Azure Communication Services access token
+ - Audience: _`Azure Communication Services`_ - data plane
+ - Azure Communication Services Resource ID: Contoso's _`Azure Communication Services Resource ID`_
+
+## Next steps
+
+The following articles might be of interest to you:
+
+- Learn more about [authentication](../authentication.md).
+- Try [quickstart for authentication of Teams users](../../quickstarts/manage-teams-identity.md).
+- Try [quickstart for calling to a Teams user](../../quickstarts/voice-video-calling/get-started-with-voice-video-calling-custom-teams-client.md).
communication-services Custom Teams Endpoint Firewall Configuration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/interop/custom-teams-endpoint-firewall-configuration.md
+
+ Title: Firewall configuration
+
+description: This article describes firewall configuration requirements to enable a custom Teams endpoint.
+++++ Last updated : 06/30/2021+++++
+# Firewall configuration
+
+Azure Communication Services lets you use the Communication Services calling software development kit (SDK) to build a custom Teams calling experience. To enable this experience, administrators need to configure the firewall according to both Communication Services and Microsoft Teams guidance. The Communication Services requirements enable the control plane, and the Teams requirements enable the calling experience. If an independent software vendor (ISV) provides the authentication experience, use that vendor's configuration guidance instead of the Communication Services configuration.
+
+The following articles might be of interest to you:
+
+- Learn more about [Azure Communication Services firewall configuration](../voice-video-calling/network-requirements.md).
+- Learn about [Microsoft Teams firewall configuration](https://docs.microsoft.com/microsoft-365/enterprise/urls-and-ip-address-ranges?view=o365-worldwide#skype-for-business-online-and-microsoft-teams).
communication-services Custom Teams Endpoint Use Cases https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/interop/custom-teams-endpoint-use-cases.md
+
+ Title: Use cases for custom Teams endpoint
+
+description: This article describes use cases for a custom Teams endpoint.
+++++ Last updated : 06/30/2021+++++
+# Custom Teams endpoint: use cases
+
+Microsoft Teams provides identities managed by Azure Active Directory and calling experiences controlled by the Teams admin center and policies. Users might have licenses assigned to enable PSTN connectivity and the advanced calling capabilities of Teams Phone System. Azure Communication Services supports Teams identities for managing Teams VoIP calls and Teams PSTN calls, and for joining Teams meetings. Developers might extend Azure Communication Services with the Graph API to provide contextual data from the Microsoft 365 ecosystem. This page shows how to use existing Microsoft technologies to provide an end-to-end experience for calling scenarios with Teams users and the Azure Communication Services calling SDKs.
+
+## Use case 1: Make outbound Teams PSTN call
+This scenario shows a multi-tenant use case, where the company Contoso provides SaaS to the company Fabrikam. The SaaS allows Fabrikam's users to make Teams PSTN calls via a custom website that takes the identity of the Teams user and the configuration of the PSTN connectivity assigned to that Teams user.
+
+![Diagram is showing user experience of Alice making Teams PSTN call to customer Megan.](./media/custom-teams-endpoint/end-to-end-use-cases/cte-e2e-cte-to-pstn-out-overview.svg)
+
+The following sequence diagram shows detailed steps of initiation of a Teams PSTN call:
++
+### Steps
+1. Authenticate Alice from Fabrikam in Contoso's client application: Alice uses a browser to open Fabrikam's web page and authenticates. You can find more details about [authentication with a Teams identity](./custom-teams-endpoint-authentication-overview.md). If the authentication is successful, Alice is redirected to the initial page.
+2. Load customers and their PSTN numbers: Contoso provides custom logic to retrieve the list of customers and their associated phone numbers. This list is rendered to Alice on the initial page.
+3. Initiate a call to Megan: Alice selects a button to initiate a PSTN call to Megan in Contoso's client application. The client application uses the Azure Communication Services calling SDK to provide the calling capability. First, it creates an instance of CallAgent, which holds the Azure Communication Services access token acquired during the first step.
+
+```js
+const callClient = new CallClient();
+tokenCredential = new AzureCommunicationTokenCredential('<ALICE_ACCESS_TOKEN>');
+callAgent = await callClient.createCallAgent(tokenCredential)
+```
+Then you need to start a call to Megan's phone number.
+
+```js
+const pstnCallee = { phoneNumber: '<MEGAN_PHONE_NUMBER_E164_FORMAT>' }
+const oneToOneCall = callAgent.startCall([pstnCallee], { threadId: '00000000-0000-0000-0000-000000000000' });
+```
+4. Connecting the PSTN call to Megan: The call is routed through the Teams PSTN connectivity assigned to Alice, reaching the PSTN network and ringing the phone associated with the provided phone number. Megan sees an incoming call from the phone number associated with Alice's Teams user.
+5. Megan accepts the call: Megan accepts the call, and the connection between Alice and Megan is established.
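
The `<MEGAN_PHONE_NUMBER_E164_FORMAT>` placeholder above must be an E.164-formatted number. A minimal client-side sanity check might look like the following sketch (the helper and the regex are assumptions for illustration, not part of the calling SDK):

```js
// Minimal E.164 sanity check (hypothetical helper, not part of the SDK):
// a leading '+' followed by 1-15 digits, the first of which is non-zero.
function isLikelyE164(phoneNumber) {
  return /^\+[1-9]\d{1,14}$/.test(phoneNumber);
}

console.log(isLikelyE164('+14255550123')); // true
console.log(isLikelyE164('425-555-0123')); // false
```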
+
+## Use case 2: Receive inbound Teams PSTN call
+This scenario shows a multi-tenant use case, where the company Contoso provides SaaS to the company Fabrikam. The SaaS allows Fabrikam's users to receive Teams PSTN calls via a custom website that takes the identity of the Teams user and the configuration of the PSTN connectivity assigned to that Teams user.
+
+![Diagram is showing user experience of Alice receiving Teams PSTN call from customer Megan.](./media/custom-teams-endpoint/end-to-end-use-cases/cte-e2e-cte-to-pstn-in-overview.svg)
+
+The following sequence diagram shows detailed steps for accepting incoming Teams PSTN calls:
++
+### Steps
+1. Authenticate Alice from Fabrikam in Contoso's client application: Alice uses a browser to open Fabrikam's web page and authenticates. You can find more details about [authentication with a Teams identity](./custom-teams-endpoint-authentication-overview.md). If the authentication is successful, Alice is redirected to the initial page.
+2. Subscribe for receiving calls: The client application uses the Azure Communication Services calling SDK to provide the calling capability. First, it creates an instance of CallAgent, which holds the Azure Communication Services access token acquired during the first step.
+
+```js
+const callClient = new CallClient();
+tokenCredential = new AzureCommunicationTokenCredential('<ALICE_ACCESS_TOKEN>');
+callAgent = await callClient.createCallAgent(tokenCredential)
+```
+Then you subscribe to the incoming call event.
+
+```js
+const incomingCallHandler = async (args: { incomingCall: IncomingCall }) => {
+ // Get information about caller
+ var callerInfo = incomingCall.callerInfo
+
+ showIncomingCall(callerInfo,incomingCall);
+};
+callAgent.on('incomingCall', incomingCallHandler);
+```
+The method _showIncomingCall_ is a custom Contoso method that renders a user interface indicating the incoming call, with two buttons to accept or decline it. If the user selects the accept button, the following code is used:
+
+```js
+// Accept the call
+var call = await incomingCall.accept();
+```
+
+If you select the decline button, then the following code is used:
++
+```js
+// Reject the call
+incomingCall.reject();
+```
+
+3. Megan starts a call to the PSTN number assigned to Teams user Alice: Megan uses her phone to call Alice. The carrier network connects to the Teams PSTN connectivity assigned to Alice, and it rings all Teams endpoints registered for Alice: Teams desktop, mobile, and web clients, and applications based on the Azure Communication Services calling SDK.
+4. Contoso's client application shows Megan's incoming call: The client application receives an incoming call notification. The _showIncomingCall_ method uses custom Contoso logic to translate the phone number to the customer's name (for example, by using a database that stores key-value pairs of phone numbers and customer names). When the information is retrieved, the notification is shown to Alice in Contoso's client application.
+5. Alice accepts the call: Alice selects a button to accept the call and the connection between Alice and Megan is established.
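
The phone-number-to-customer-name translation in step 4 can be sketched as a simple lookup (all names, numbers, and the `resolveCallerName` helper below are hypothetical; a real implementation would query Contoso's customer database):

```js
// Hypothetical Contoso lookup used by showIncomingCall: map a caller's phone
// number to a customer name.
const customerDirectory = new Map([
  ['+14255550123', 'Megan'],
  ['+14255550199', 'Fabrikam Support'],
]);

function resolveCallerName(phoneNumber) {
  // Fall back to a generic label for numbers outside the directory.
  return customerDirectory.get(phoneNumber) ?? 'Unknown caller';
}

console.log(resolveCallerName('+14255550123')); // 'Megan'
```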
++
+## Use case 3: Make outbound Teams VoIP call
+This scenario shows a multi-tenant use case, where the company Contoso provides SaaS to the company Fabrikam. The SaaS allows Fabrikam's users to make Teams VoIP calls via a custom website that takes the identity of the Teams user.
+
+![Diagram is showing user experience of Alice making Teams VoIP call to colleague Megan.](./media/custom-teams-endpoint/end-to-end-use-cases/cte-e2e-cte-to-voip-out-overview.svg)
+
+The following sequence diagram shows detailed steps for initiation of a Teams VoIP call:
++
+### Steps
+1. Authenticate Alice from Fabrikam in Contoso's client application: Alice uses a browser to open Fabrikam's web page and authenticates. You can find more details about [authentication with a Teams identity](./custom-teams-endpoint-authentication-overview.md). If the authentication is successful, Alice is redirected to the initial page.
+2. Load users from Fabrikam's organization and their identifiers: The Contoso client application uses the Graph API to get the list of users from Fabrikam's tenant. Alice or her admin needs to grant consent for the Graph API to perform this action. You can learn more about [the Graph API command in the documentation](https://docs.microsoft.com/graph/api/user-list).
+
+```
+GET https://graph.microsoft.com/v1.0/users
+Permissions: User.ReadBasic.All (delegated)
+Response: response.body.value[1].displayName; // "Megan Bowen"
+ response.body.value[1].id; // "e8b753b5-4117-464e-9a08-713e1ff266b3"
+```
+
+Contoso's client application will then show the list of users and the ability to initiate a call to a given user.
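
The response shape shown above can be turned into call targets for the UI. The sketch below hard-codes a response matching that shape; the `callTargets` name and the rendering shape are assumptions for illustration:

```js
// Hypothetical shaping of a Graph GET /users response into entries the
// calling UI can render. The response object mirrors the example above.
const response = {
  body: {
    value: [
      { displayName: 'Alice Smith', id: '8c0a1a67-50ce-4114-bb6c-da9c5dbcf6ca' },
      { displayName: 'Megan Bowen', id: 'e8b753b5-4117-464e-9a08-713e1ff266b3' },
    ],
  },
};

const callTargets = response.body.value.map(u => ({
  label: u.displayName,
  // Shape of a Teams callee accepted by callAgent.startCall:
  callee: { microsoftTeamsUserId: u.id },
}));
```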
+
+3. Initiate a call to Megan: Alice selects a button to initiate a Teams VoIP call to Megan in Contoso's client application. The client application uses the Azure Communication Services calling SDK to provide the calling capability. Calls in Teams clients are associated with a Teams chat. First, the application requests creation of a dedicated chat for the VoIP call.
+
+```
+POST https://graph.microsoft.com/v1.0/chats
+Body:
+{
+ "chatType": "oneOnOne",
+ "members": [
+ {
+ "@odata.type": "#microsoft.graph.aadUserConversationMember",
+ "roles": [
+ "owner"
+ ],
+ "user@odata.bind": "https://graph.microsoft.com/v1.0/users('8c0a1a67-50ce-4114-bb6c-da9c5dbcf6ca')"
+ },
+ {
+ "@odata.type": "#microsoft.graph.aadUserConversationMember",
+ "roles": [
+ "owner"
+ ],
+ "user@odata.bind": "https://graph.microsoft.com/v1.0/users('e8b753b5-4117-464e-9a08-713e1ff266b3')"
+ }
+ ]
+}
+Permissions: Chat.Create (delegated)
+Response: response.body.value.id; // "19:8c0a1a67-50ce-4114-bb6c-da9c5dbcf6ca_e8b753b5-4117-464e-9a08-713e1ff266b3@unq.gbl.spaces"
+```
+
+Then the client application creates an instance of CallAgent, which holds the Azure Communication Services access token acquired during the first step.
+
+```js
+const callClient = new CallClient();
+tokenCredential = new AzureCommunicationTokenCredential('<ALICE_ACCESS_TOKEN>');
+callAgent = await callClient.createCallAgent(tokenCredential)
+```
+Then you start a call to Megan's Teams ID.
+
+```js
+var teamsUser = { microsoftTeamsUserId: 'e8b753b5-4117-464e-9a08-713e1ff266b3'};
+const oneToOneCall = callAgent.startCall([teamsUser], { threadId: '19:8c0a1a67-50ce-4114-bb6c-da9c5dbcf6ca_e8b753b5-4117-464e-9a08-713e1ff266b3@unq.gbl.spaces' });
+```
+
+4. Connecting the VoIP call to Megan: The call is routed through Teams, ringing the Teams clients associated with Megan. Megan sees an incoming call from Alice with the name defined in Azure AD.
+5. Megan accepts the call: Megan accepts the call, and the connection between Alice and Megan is established.
++
+## Use case 4: Receive inbound Teams VoIP call
+This scenario shows a multi-tenant use case, where the company Contoso provides SaaS to the company Fabrikam. The SaaS allows Fabrikam's users to receive Teams VoIP calls via a custom website that takes the identity of the Teams user and applies the routing policies assigned to the Teams user.
+
+![Diagram is showing user experience of Alice receiving Teams VoIP call from customer Megan.](./media/custom-teams-endpoint/end-to-end-use-cases/cte-e2e-cte-to-voip-in-overview.svg)
+
+The following sequence diagram shows detailed steps for accepting incoming Teams VoIP calls:
++
+### Steps
+1. Authenticate Alice from Fabrikam in Contoso's client application: Alice uses a browser to open Fabrikam's web page and authenticates. You can find more details about [authentication with a Teams identity](./custom-teams-endpoint-authentication-overview.md). If the authentication is successful, Alice is redirected to the initial page.
+2. Subscribe for receiving calls: The client application uses the Azure Communication Services calling SDK to provide the calling capability. First, it creates an instance of CallAgent, which holds the Azure Communication Services access token acquired during the first step.
+
+```js
+const callClient = new CallClient();
+tokenCredential = new AzureCommunicationTokenCredential('<ALICE_ACCESS_TOKEN>');
+callAgent = await callClient.createCallAgent(tokenCredential)
+```
+Then the application subscribes to the incoming call event.
+
+```js
+const incomingCallHandler = async (args: { incomingCall: IncomingCall }) => {
+ // Get information about caller
+ var callerInfo = incomingCall.callerInfo
+
+ showIncomingCall(callerInfo,incomingCall);
+};
+callAgent.on('incomingCall', incomingCallHandler);
+```
+The method _showIncomingCall_ is a custom Contoso method that renders a user interface indicating the incoming call, with two buttons to accept or decline it. If the user selects the accept button, the following code is used:
+
+```js
+// Accept the call
+var call = await incomingCall.accept();
+```
+
+If you select the decline button, then the following code is used:
++
+```js
+// Reject the call
+incomingCall.reject();
+```
+
+3. Megan starts a VoIP call to Teams user Alice: Megan uses her Teams desktop client to call Alice. The Teams infrastructure rings all endpoints associated with Alice: Teams desktop, mobile, and web clients, and applications based on the Azure Communication Services calling SDK.
+4. Contoso's client application shows Megan's incoming call: The client application receives an incoming call notification. The _showIncomingCall_ method uses the Graph API to translate the Teams user ID to a display name.
+
+```
+GET https://graph.microsoft.com/v1.0/users/e8b753b5-4117-464e-9a08-713e1ff266b3
+Permissions: User.Read (delegated)
+Response: response.body.value.displayName; // "Megan Bowen"
+ response.body.value.id; // "e8b753b5-4117-464e-9a08-713e1ff266b3"
+```
+
+When the information is retrieved, the notification is shown to Alice in Contoso's client application.
+
+5. Alice accepts the call: Alice selects a button to accept the call and the connection between Alice and Megan is established.
+
+## Use case 5: Join Teams meeting
+This scenario shows a multi-tenant use case, where the company Contoso provides SaaS to the company Fabrikam. The SaaS allows Fabrikam's users to join Teams meetings via a custom website that takes the identity of the Teams user.
+
+![Diagram is showing user experience of Alice joining Teams Meeting.](./media/custom-teams-endpoint/end-to-end-use-cases/cte-e2e-cte-to-meeting-overview.svg)
+
+The following sequence diagram shows detailed steps for joining a Teams meeting:
++
+### Steps
+1. Authenticate Alice from Fabrikam in Contoso's client application: Alice uses a browser to open Fabrikam's web page and authenticates. You can find more details about [authentication with a Teams identity](./custom-teams-endpoint-authentication-overview.md). If the authentication is successful, Alice is redirected to the initial page.
+2. Load Teams meetings and their identifiers: The Contoso client application uses the Graph API to get the list of Teams meetings for Fabrikam's users. Alice or her admin needs to grant consent for the Graph API to perform this action. You can learn more about [the Graph API command in the documentation](https://docs.microsoft.com/graph/api/user-list-calendarview).
+
+```
+GET https://graph.microsoft.com/v1.0/me/calendar/calendarView?startDateTime={start_datetime}&endDateTime={end_datetime}
+Permissions: Calendars.Read (delegated)
+Response: response.body.value[0].subject; // "Project Tailspin"
+ response.body.value[0].onlineMeeting.joinUrl; // "https://teams.microsoft.com/l/meetup-join/..."
+ response.body.value[0].start.dateTime;
+ response.body.value[0].end.dateTime;
+ response.body.value[0].location.displayName;
+```
+
+Contoso's client application will then show the list of Teams meetings and the ability to join them.
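
The calendarView response above can be filtered down to joinable meetings before rendering. The sketch below hard-codes a response matching that shape; all names and values are illustrative:

```js
// Hypothetical shaping of a Graph calendarView response into a meeting list.
const response = {
  body: {
    value: [
      {
        subject: 'Project Tailspin',
        onlineMeeting: { joinUrl: 'https://teams.microsoft.com/l/meetup-join/...' },
        start: { dateTime: '2021-06-30T14:00:00' },
        end: { dateTime: '2021-06-30T15:00:00' },
      },
      // An event without a Teams join link is not joinable from the app:
      { subject: 'Lunch', onlineMeeting: null },
    ],
  },
};

// Keep only events that carry a Teams join link.
const joinableMeetings = response.body.value
  .filter(e => e.onlineMeeting && e.onlineMeeting.joinUrl)
  .map(e => ({ subject: e.subject, joinUrl: e.onlineMeeting.joinUrl }));
```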
+
+3. Join the Teams meeting "Project Tailspin": Alice selects a button to join the Teams meeting "Project Tailspin" in Contoso's client application. The client application uses the Azure Communication Services calling SDK to provide the calling capability. The client application creates an instance of CallAgent, which holds the Azure Communication Services access token acquired during the first step.
+
+```js
+const callClient = new CallClient();
+tokenCredential = new AzureCommunicationTokenCredential('<ALICE_ACCESS_TOKEN>');
+callAgent = await callClient.createCallAgent(tokenCredential)
+```
+Then the application joins the meeting via the received joinUrl.
+
+```js
+const meetingLocator = { meetingLink: 'https://teams.microsoft.com/l/meetup-join/...' };
+const meetingCall = callAgent.join(meetingLocator);
+```
+
+Alice then joins the Teams meeting.
+
+4. Other participants join the Teams meeting: The provided experience is a standard Teams meeting. Based on the configuration and invites, the Teams meeting can be joined by Teams users, anonymous users via the Teams web client, users of Teams desktop, mobile, and web clients, Azure Communication Services users via applications based on the Communication Services calling SDK, or users joining over the phone.
+
+## Next steps
+
+The following articles might be of interest to you:
+
+- Learn more about [authentication](../authentication.md).
+- Try [quickstart for authentication of Teams users](../../quickstarts/manage-teams-identity.md).
+- Try [quickstart for calling to a Teams user](../../quickstarts/voice-video-calling/get-started-with-voice-video-calling-custom-teams-client.md).
communication-services Join Teams Meeting https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/join-teams-meeting.md
> [!IMPORTANT] > BYOI interoperability is now generally available to all Communication Services applications and Teams organizations.
-Azure Communication Services can be used to build applications that enable users to join and participate in Teams meetings. [Standard ACS pricing](https://azure.microsoft.com/pricing/details/communication-services/) applies to these users, but there's no additional fee for the interoperability capability itself. With the bring your own identity (BYOI) model, you control user authentication and users of your applications don't need Teams licenses to join Teams meetings. This is ideal for business-to-consumer solutions that enable licensed Teams users (for example, healthcare providers or financial advisors) and external users (for example, patients or clients) using a custom application to join into a virtual consultation experience.
+Azure Communication Services can be used to build applications that enable users to join and participate in Teams meetings. [Standard ACS pricing](https://azure.microsoft.com/pricing/details/communication-services/) applies to these users, but there's no additional fee for the interoperability capability itself. With the bring your own identity (BYOI) model, you control user authentication, and users of your applications don't need Teams licenses to join Teams meetings. This is ideal for applications that enable licensed Teams users and external users using a custom application to join a virtual consultation experience. For example, healthcare providers using Teams can conduct telehealth virtual visits with their patients who use a custom application.
It's also possible to use Teams identities with the Azure Communication Services SDKs. More information is available [here](./teams-interop.md).
It's currently not possible for a Teams user to join a call that was initiated u
## Enabling anonymous meeting join in your Teams tenant
-When a BYOI user joins a Teams meeting, they're treated as an anonymous external user, similar to users that join a Teams meeting anonymously using the Teams web application. The ability for BYOI users to join Teams meetings as anonymous users is controlled by the existing "allow anonymous meeting join" configuration, which also controls the existing Teams anonymous meeting join. This setting can be updated in the [Teams admin center](https://admin.teams.microsoft.com/meetings/settings) or with the Teams PowerShell cmdlet [Set-CsTeamsMeetingConfiguration](/powershell/module/skype/set-csteamsmeetingconfiguration).
+When a BYOI user joins a Teams meeting, they're treated as an anonymous external user, similar to users that join a Teams meeting anonymously using the Teams web application. The ability for BYOI users to join Teams meetings as anonymous users is controlled by the existing "allow anonymous meeting join" configuration. This same configuration also controls the existing Teams anonymous meeting join. This setting can be updated in the [Teams admin center](https://admin.teams.microsoft.com/meetings/settings) or with the Teams PowerShell cmdlet [Set-CsTeamsMeetingConfiguration](/powershell/module/skype/set-csteamsmeetingconfiguration).
-Custom applications built with Azure Communication Services to connect and communicate with Teams users may be used by end users or by bots, and there is no differentiation in how they appear to Teams users unless the developer of the application explicitly indicates this as part of the communication. Your custom application should consider user authentication and other security measures to protect Teams meetings. Be mindful of the security implications of enabling anonymous users to join meetings, and use the [Teams security guide](/microsoftteams/teams-security-guide#addressing-threats-to-teams-meetings) to configure capabilities available to anonymous users.
+Custom applications built with Azure Communication Services to connect and communicate with Teams users can be used by end users or by bots, and there is no differentiation in how they appear to Teams users unless the developer of the application explicitly indicates this as part of the communication. Your custom application should consider user authentication and other security measures to protect Teams meetings. Be mindful of the security implications of enabling anonymous users to join meetings, and use the [Teams security guide](/microsoftteams/teams-security-guide#addressing-threats-to-teams-meetings) to configure capabilities available to anonymous users.
## Meeting experience
-As with Teams anonymous meeting join, your application must have the meeting link to join, which can be retrieved via the Graph API or from the calendar in Microsoft Teams. The name of BYOI users displayed in Teams is configurable via the Communication Services Calling SDK and they're labeled as "external" to let Teams users know they haven't been authenticated using Azure Active Directory. When the first ACS user joins a Teams meeting, the Teams client will display a message indicating that some features might not be available because one of the participants is using a custom client.
+As with Teams anonymous meeting join, your application must have the meeting link to join, which can be retrieved via the Graph API or from the calendar in Microsoft Teams. The name of BYOI users that is displayed in Teams is configurable via the Communication Services Calling SDK. They are labeled as "external" to let Teams users know they weren't authenticated using Azure Active Directory.
-A Communication Service user will not be admitted to a Teams meeting until there is at least one Teams user present in the meeting. Once a Teams user is present, then the Communication Services user will wait in the lobby until explicitly admitted by a Teams user, unless the "Who can bypass the lobby?" meeting policy/setting is set to "Everyone".
+A Communication Service user won't be admitted to a Teams meeting until there is at least one Teams user present in the meeting. Once a Teams user is present, then the Communication Services user will wait in the lobby until explicitly admitted by a Teams user, unless the "Who can bypass the lobby?" meeting policy/setting is set to "Everyone".
-During a meeting, Communication Services users will be able to use core audio, video, screen sharing, and chat functionality via Azure Communication Services SDKs. Once a Communication Services user leaves the meeting or the meeting ends, they can no longer send or receive new chat messages, but they will have access to messages sent and received during the meeting. Anonymous Communication Services users cannot add/remove participants to/from the meeting and they cannot start recording or transcription for the meeting.
+During a meeting, Communication Services users will be able to use core audio, video, screen sharing, and chat functionality via Azure Communication Services SDKs. Once a Communication Services user leaves the meeting or the meeting ends, they're no longer able to send or receive new chat messages, but they'll have access to messages sent and received during the meeting. Anonymous Communication Services users can't add/remove participants to/from the meeting nor can they start recording or transcription for the meeting.
Additional information on required dataflows for joining Teams meetings is available at the [client and server architecture page](client-and-server-architecture.md). The [Group Calling Hero Sample](../samples/calling-hero-sample.md) provides example code for joining a Teams meeting from a web application.
+## Diagnostics and call analytics
+After a Teams meeting ends, diagnostic information about the meeting is available using the [Communication Services logging and diagnostics](/azure/communication-services/concepts/logging-and-diagnostics) and using the [Teams Call Analytics](/MicrosoftTeams/use-call-analytics-to-troubleshoot-poor-call-quality) in the Teams admin center. Communication Services users will appear as "Anonymous" in Call Analytics screens. Communication Services users aren't included in the [Teams real-time Analytics](/microsoftteams/use-real-time-telemetry-to-troubleshoot-poor-meeting-quality).
+## Privacy
+
+Interoperability between Azure Communication Services and Microsoft Teams enables your applications and users to participate in Teams calls, meetings, and chat. It is your responsibility to ensure that the users of your application are notified when recording or transcription are enabled in a Teams call or meeting.
Microsoft will indicate to you via the Azure Communication Services API that rec
## Limitations and known issues

-- Communication Services users may join a Teams meeting that is scheduled for a Teams channel and use audio and video, but they will not be able to send or receive any chat messages, since they are not members of the channel.
-- Communication Services users may join a Teams webinar, but the presenter and attendee roles are not currently enforced, thus Communication Services users could perform actions not intended for attendees, such as screen sharing, turning their camera on/off, or unmuting themselves, if your application provides UX for those actions.
+- Communication Services users can join a Teams meeting that is scheduled for a Teams channel and use audio and video, but they won't be able to send or receive any chat messages because they aren't members of the channel.
+- Communication Services users may join a Teams webinar, but the presenter and attendee roles aren't currently enforced. As a result, Communication Services users could perform actions not intended for attendees, such as screen sharing, turning their camera on or off, or unmuting themselves, if your application provides UX for those actions.
- When using Microsoft Graph to [list the participants in a Teams meeting](/graph/api/call-list-participants), details for Communication Services users are not currently included.
-- PowerPoint presentations are not rendered for Communication Services users.
+- PowerPoint presentations aren't rendered for Communication Services users.
- Teams meetings support up to 1000 participants, but the Azure Communication Services Calling SDK currently only supports 350 participants and Chat SDK supports 250 participants.
- With [Cloud Video Interop for Microsoft Teams](/microsoftteams/cloud-video-interop), some devices have seen issues when a Communication Services user shares their screen.
-- [Communication Services voice and video calling events](../../event-grid/communication-services-voice-video-events.md) are not raised for Teams meeting.
+- [Communication Services voice and video calling events](/azure/event-grid/communication-services-voice-video-events) aren't raised for Teams meetings.
- Features such as reactions, raised hand, together mode, and breakout rooms are only available for Teams users.
-- Communication Services users cannot interact with poll or Q&A apps in meetings.
-- Communication Services won't have access to all chat features supported by Teams. They can send and receive text messages, use typing indicators, read receipts and other features supported by Chat SDK. However features like file sharing, reply or react to a message are not supported for Communication Services users.
-- The Calling SDK does not currently support closed captions for Teams meetings.
-- Communication Services users are not included in the [Real-Time Analytics](/microsoftteams/use-real-time-telemetry-to-troubleshoot-poor-meeting-quality).
-- Communication Services users cannot join [Teams live events](/microsoftteams/teams-live-events/what-are-teams-live-events).
-- [Teams activity handler events](/microsoftteams/platform/bots/bot-basics?tabs=csharp) for bots do not fire when Communication Services users join a Teams meeting.
+- Communication Services users can't interact with poll or Q&A apps in meetings.
+- Communication Services users won't have access to all chat features supported by Teams. They can send and receive text messages, use typing indicators, read receipts, and other features supported by the Chat SDK. However, features like file sharing, replying to a message, or reacting to a message aren't supported for Communication Services users.
+- The Calling SDK doesn't currently support closed captions for Teams meetings.
+- Communication Services users can't join [Teams live events](/microsoftteams/teams-live-events/what-are-teams-live-events).
+- [Teams activity handler events](/microsoftteams/platform/bots/bot-basics?tabs=csharp) for bots don't fire when Communication Services users join a Teams meeting.
## Next steps
communication-services Concepts https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/router/concepts.md
Azure Communication Services Job Router solves the problem of matching supply with demand.
-A real-world example of this may be call center agents (supply) being matched to incoming support calls (demand).
+A real-world example of this is matching call center agents (supply) to incoming support calls (demand).
## Job
-A Job represents a unit of work (demand), which needs to be routed to an available Worker (supply).
+A Job is a unit of work (demand) that must be routed to an available Worker (supply).
-A real-world example of this may be an incoming call or chat in the context of a call center.
+A real-world example is an incoming call or chat in the context of a call center.
### Job submission flow

1. Your application submits a Job via the Job Router SDK.
-2. The Job is classified and a [JobClassified Event][job_classified_event] is sent via EventGrid, which includes all the information about the Job and how the classification process may have modified its properties.
-
- :::image type="content" source="../media/router/acs-router-job-submission.png" alt-text="Diagram showing Communication Services' Job Router submitting a job.":::
+2. The Job is classified and a [JobClassified Event][job_classified_event] is sent via Event Grid.
+
+ :::image type="content" source="../media/router/acs-router-job-submission.png" alt-text="Diagram of job submission.":::
## Worker
-A Worker represents the supply available to handle a Job. Each worker registers with one or more queues to receive jobs.
+A Worker is the supply available to handle a Job. Each worker registers with one or more queues to receive jobs.
-A real-world example of this may be an agent working in a call center.
+A real-world example is an agent in a call center.
### Worker registration flow

1. When your Worker is ready to take on work, you can register the worker via the Job Router SDK.
2. Job Router then sends a [WorkerRegistered Event][worker_registered_event].
- :::image type="content" source="../media/router/acs-router-worker-registration.png" alt-text="Diagram showing Communication Services' Job Router worker registration.":::
+ :::image type="content" source="../media/router/acs-router-worker-registration.png" alt-text="Diagram of worker registration.":::
## Queue
-A Queue represents an ordered list of jobs waiting to be served by a worker. Workers will register with a queue to receive work from it.
+A Queue is an ordered list of jobs waiting to be served by a worker. Workers register with a queue to receive work from it.
-A real-world example of this may be a call queue in a call center.
+A real-world example is a call queue in a call center.
## Channel
-A Channel represents a grouping of jobs by some type. When a worker registers to receive work, they must also specify for which channels they can handle work, and how much of each can they handle concurrently. Channels are just a string discriminator and aren't explicitly created.
+A Channel is a grouping of jobs by some type. When a worker registers to receive work, they must also specify which channels they can handle work for, and how much of each they can handle concurrently. Channels are just a string discriminator and aren't explicitly created.
-A real-world example of this may be `voice calls` or `chats` in a call center.
+Real-world examples are `voice calls` or `chats` in a call center.
## Offer
-An Offer is extended by JobRouter to a worker to handle a particular job when it determines a match. When this happens, you'll be notified via [EventGrid][subscribe_events]. You can either accept or decline the offer using the JobRouter SDK, or it will expire according to the time to live configured on the Distribution Policy.
+When Job Router determines a match, it extends an Offer to a worker to handle a particular job. You can either accept or decline the offer with the JobRouter SDK. If you ignore the offer, it expires according to the time to live configured on the Distribution Policy.
-A real-world example of this may be the ringing of an agent in a call center.
+A real-world example is the ringing of an agent in a call center.
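The offer lifecycle (issued, then accepted, declined, or expired per the Distribution Policy's time to live) can be sketched in a few lines. This is an illustrative sketch with hypothetical names, not the Job Router SDK:

```python
import time

class Offer:
    """Hypothetical sketch of an offer's lifecycle: issued -> accepted/declined/expired."""

    def __init__(self, worker_id, job_id, ttl_seconds):
        self.worker_id = worker_id
        self.job_id = job_id
        self.issued_at = time.monotonic()
        self.ttl_seconds = ttl_seconds
        self.state = "issued"

    def is_expired(self):
        # Offers that are neither accepted nor declined expire after the
        # time-to-live configured on the Distribution Policy.
        return self.state == "issued" and time.monotonic() - self.issued_at > self.ttl_seconds

    def accept(self):
        if self.is_expired():
            raise RuntimeError("offer expired")
        self.state = "accepted"

    def decline(self):
        if self.is_expired():
            raise RuntimeError("offer expired")
        self.state = "declined"
```

In the real service, the transitions are driven by the OfferIssued and OfferAccepted events rather than local state.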
### Offer flow
-1. When Job Router finds a matching Worker for a Job, it offers the work by sending a [OfferIssued Event][offer_issued_event] via EventGrid.
+1. When Job Router finds a matching Worker for a Job, it creates an Offer and sends an [OfferIssued Event][offer_issued_event] via [Event Grid][subscribe_events].
2. The Offer is accepted via the Job Router API.
-3. Job Router sends a [OfferAccepted Event][offer_accepted_event] signifying to the Contoso Application the Worker is assigned to the Job.
+3. Job Router sends an [OfferAccepted Event][offer_accepted_event].
:::image type="content" source="../media/router/acs-router-accept-offer.png" alt-text="Diagram showing Communication Services' Job Router accept offer.":::

## Distribution Policy
-A Distribution Policy represents a configuration set that controls how jobs in a queue are distributed to workers registered with that queue.
+A Distribution Policy is a configuration set that controls how jobs in a queue are distributed to workers registered with that queue.
This configuration includes:

- How long an Offer is valid before it expires.
This configuration includes:
### Distribution modes
-The 3 types of modes are
+The three types of modes are:
- **Round Robin**: Workers are ordered by `Id` and the next worker after the previous one that got an offer is picked.
- **Longest Idle**: The worker that has not been working on a job for the longest.
-- **Best Worker**: The workers that are best able to handle the job will be picked first. The logic to determine this can be optionally customized by specifying an expression or azure function to compare 2 workers and determine which one to pick.
+- **Best Worker**: The workers that are best able to handle the job are picked first. The logic to rank Workers can be customized, with an expression or Azure function to compare two workers.
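The round-robin and longest-idle modes can be illustrated with a short sketch. The worker shape and field names here are hypothetical, chosen only to show the selection logic:

```python
def round_robin(workers, last_offered_id):
    """Pick the next worker by Id after the one that last got an offer."""
    ordered = sorted(workers, key=lambda w: w["id"])
    ids = [w["id"] for w in ordered]
    if last_offered_id in ids:
        # Wrap around to the first worker after the last one in the ordering.
        next_index = (ids.index(last_offered_id) + 1) % len(ordered)
    else:
        next_index = 0
    return ordered[next_index]

def longest_idle(workers):
    """Pick the worker that has been idle the longest (earliest last-job timestamp)."""
    return min(workers, key=lambda w: w["last_job_finished_at"])
```

Best Worker mode would replace these fixed rules with a configurable comparison between two candidate workers.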
## Labels
-You can attach labels to workers, jobs, and queues. These are key value pairs that can be of `string`, `number` or `boolean` data types.
+You can attach labels to workers, jobs, and queues. Labels are key value pairs that can be of `string`, `number`, or `boolean` data types.
-A real-world example of this may be the skill level of a particular worker or the team or geographic location.
+Real-world examples are the skill level of a particular worker, their team, or their geographic location.
## Label selectors
-Label selectors can be attached to a job in order to target a subset of workers serving the queue.
+Label selectors can be attached to a job in order to target a subset of workers on the queue.
-A real-world example of this may be a condition on an incoming call that the agent must have a minimum level of knowledge of a particular product.
+A real-world example is a condition on an incoming call that the agent must have a minimum level of knowledge of a particular product.
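As a rough illustration of how labels and label selectors interact, a selector can be modeled as a set of conditions that a worker's labels must satisfy. The shapes and operator names below are hypothetical, not the SDK's:

```python
def matches(worker_labels, selectors):
    """Return True if the worker's labels satisfy every selector condition.

    Each selector is (key, operator, value); only two illustrative
    operators are sketched here.
    """
    for key, op, value in selectors:
        actual = worker_labels.get(key)
        if actual is None:
            return False
        if op == "equal" and actual != value:
            return False
        if op == "greater_than_or_equal" and not actual >= value:
            return False
    return True

# Example: route a call only to billing agents with enough product knowledge.
worker = {"team": "billing", "product_knowledge": 4, "region": "emea"}
selectors = [("product_knowledge", "greater_than_or_equal", 3), ("team", "equal", "billing")]
```

A worker whose labels don't satisfy every condition is simply excluded from the candidate set for that job.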
## Classification policy
-A classification policy can be used to dynamically select a queue, determine job priority and attach worker label selectors to a job by leveraging a rules engine.
+A classification policy can be used to programmatically select a queue, determine job priority, or attach worker label selectors to a job.
## Exception policy
An exception policy controls the behavior of a Job based on a trigger and execut
- [Exception Policies](exception-policy.md)
- [Quickstart guide](../../quickstarts/router/get-started-router.md)
- [Manage queues](../../how-tos/router-sdk/manage-queue.md)
-- [Classifying a Job](../../how-tos/router-sdk/job-classification.md)
+- [How to classify a Job](../../how-tos/router-sdk/job-classification.md)
+- [Target a preferred worker](../../how-tos/router-sdk/preferred-worker.md)
- [Escalate a Job](../../how-tos/router-sdk/escalate-job.md)
- [Subscribe to events](../../how-tos/router-sdk/subscribe-events.md)
communication-services Direct Routing Infrastructure https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/telephony/direct-routing-infrastructure.md
Title: Azure direct routing infrastructure requirements - Azure Communication Services
+ Title: Azure direct routing infrastructure requirements - Azure Communication Services
description: Familiarize yourself with the infrastructure requirements for Azure Communication Services direct routing configuration
[!INCLUDE [Public Preview](../../includes/public-preview-include-document.md)]
-This article describes infrastructure, licensing, and Session Border Controller (SBC) connectivity details that you'll want to keep in mind as your plan your Azure direct routing deployment.
+This article describes infrastructure, licensing, and Session Border Controller (SBC) connectivity details that you want to keep in mind as you plan your Azure direct routing deployment.
## Infrastructure requirements
The infrastructure requirements for the supported SBCs, domains, and other netwo
|Infrastructure requirement|You need the following|
|: |: |
|Session Border Controller (SBC)|A supported SBC. For more information, see [Supported SBCs](#supported-session-border-controllers-sbcs).|
-|Telephony trunks connected to the SBC|One or more telephony trunks connected to the SBC. On one end, the SBC connects to the Azure Communication Service via direct routing. The SBC can also connect to third-party telephony entities, such as PBXs, Analog Telephony Adapters, and so on. Any PSTN connectivity option connected to the SBC will work. (For configuration of the PSTN trunks to the SBC, refer to the SBC vendors or trunk providers.)|
+|Telephony trunks connected to the SBC|One or more telephony trunks connected to the SBC. On one end, the SBC connects to the Azure Communication Service via direct routing. The SBC can also connect to third-party telephony entities, such as PBXs and analog telephony adapters. Any Public Switched Telephone Network (PSTN) connectivity option connected to the SBC works. (For configuration of the PSTN trunks to the SBC, refer to the SBC vendors or trunk providers.)|
|Azure subscription|An Azure subscription that you use to create Communication Services resource, and the configuration and connection to the SBC.|
|Communication Services Access Token|To make calls, you need a valid Access Token with `voip` scope. See [Access Tokens](../identity-model.md#access-tokens)|
|Public IP address for the SBC|A public IP address that can be used to connect to the SBC. Based on the type of SBC, the SBC can use NAT.|
-|Fully Qualified Domain Name (FQDN) for the SBC|An FQDN for the SBC, where the domain portion of the FQDN does not match registered domains in your Microsoft 365 or Office 365 organization. For more information, see [SBC domain names](#sbc-domain-names).|
-|Public DNS entry for the SBC |A public DNS entry mapping the SBC FQDN to the public IP Address. |
-|Public trusted certificate for the SBC |A certificate for the SBC to be used for all communication with Azure direct routing. For more information, see [Public trusted certificate for the SBC](#public-trusted-certificate-for-the-sbc).|
+|Fully Qualified Domain Name (FQDN) for the SBC|An FQDN for the SBC, where the domain portion of the FQDN doesn't match registered domains in your Microsoft 365 or Office 365 organization. For more information, see [SBC certificates and domain names](#sbc-certificates-and-domain-names).|
+|Public DNS entry for the SBC |A public DNS entry mapping the SBC FQDN to the public IP address. |
+|Public trusted certificate for the SBC |A certificate for the SBC to be used for all communication with Azure direct routing. For more information, see [SBC certificates and domain names](#sbc-certificates-and-domain-names).|
|Firewall IP addresses and ports for SIP signaling and media |The SBC communicates to the following services in the cloud:<br/><br/>SIP Proxy, which handles the signaling<br/>Media Processor, which handles media<br/><br/>These two services have separate IP addresses in Microsoft Cloud, described later in this document.|
-## SBC domain names
+## SBC certificates and domain names
-Customers without Office 365 can use any domain name for which they can obtain a public certificate.
+Microsoft recommends that you request the certificate for the SBC by generating a certificate signing request (CSR). For specific instructions on how to generate a CSR for an SBC, refer to the interconnection instructions or documentation provided by your SBC vendor.
-The following table shows examples of DNS names registered for the tenant, whether the name can be used as a fully qualified domain name (FQDN) for the SBC, and examples of valid FQDN names:
+ >[!NOTE]
+ > Most Certificate Authorities (CAs) require the private key size to be at least 2048 bits. Keep this in mind when you generate the CSR.
-|DNS name|Can be used for SBC FQDN|Examples of FQDN names|
-|: |: |: |
-|contoso.com|Yes|**Valid names:**<br/>sbc1.contoso.com<br/>ssbcs15.contoso.com<br/>europe.contoso.com|
-|contoso.onmicrosoft.com|No|Using *.onmicrosoft.com domains is not supported for SBC names|
+The certificate must have the SBC FQDN in the common name (CN) or the subject alternative name (SAN) field. The certificate should be issued directly by a certificate authority, not an intermediate provider.
-If you are an Office 365 customer, then the SBC domain name must not match registered in Domains of the Office 365 tenant. Below is the example of Office 365 and Azure Communication Service coexistence:
+Alternatively, Communication Services direct routing supports a wildcard in the CN and/or SAN, and the wildcard must conform to standard [RFC HTTP Over TLS](https://tools.ietf.org/html/rfc2818#section-3.1).
-|Domain registered in Office 365|Examples of SBC FQDN in Teams|Examples of SBC FQDN names in Azure Communication Services|
-|: |: |: |
-|**contoso.com** (second level domain)|**sbc.contoso.com** (name in the second level domain)|**sbc.acs.contoso.com** (name in the third level domain)<br/>**sbc.fabrikam.com** (any name within different domain)|
-|**o365.contoso.com** (third level domain)|**sbc.o365.contoso.com** (name in the third level domain)|**sbc.contoso.com** (name in the second level domain)<br/>**sbc.acs.o365.contoso.com** (name in the fourth level domain)<br/>**sbc.fabrikam.com** (any name within different domain)|
+Customers who already use Office 365 and have a domain registered in the Microsoft 365 admin center can use an SBC FQDN from the same domain.
+Domains that weren't previously used in Office 365 must be provisioned.
-SBC pairing works on the Communication Services resource level, meaning you can pair many SBCs to a single Communication Services resource, but you cannot pair a single SBC to more than one Communication Services resource. Unique SBC FQDNs are required for pairing to different resources.
-
-## Public trusted certificate for the SBC
-
-Microsoft recommends that you request the certificate for the SBC by generating a certification signing request (CSR). For specific instructions on generating a CSR for an SBC, refer to the interconnection instructions or documentation provided by your SBC vendors.
+An example would be using `\*.contoso.com`, which would match the SBC FQDN `sbc.contoso.com` but wouldn't match `sbc.test.contoso.com`.
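The single-label wildcard rule behind that example can be sketched as follows. This is an illustration of the RFC 2818 matching rule, not the service's actual validation code:

```python
def wildcard_matches(pattern, fqdn):
    """Match a certificate name against an FQDN.

    Per RFC 2818, '*' matches exactly one DNS label, so '*.contoso.com'
    matches 'sbc.contoso.com' but not 'sbc.test.contoso.com'.
    """
    p_labels = pattern.lower().split(".")
    f_labels = fqdn.lower().split(".")
    # A wildcard covers one label, so the label counts must be equal.
    if len(p_labels) != len(f_labels):
        return False
    return all(p == "*" or p == f for p, f in zip(p_labels, f_labels))
```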
- > [!NOTE]
- > Most Certificate Authorities (CAs) require the private key size to be at least 2048. Keep this in mind when generating the CSR.
+>[!IMPORTANT]
+>During Public Preview only: if you plan to use a wildcard certificate for a domain that isn't registered in Teams, raise a support ticket, and our team will add it as a trusted domain.
-The certificate needs to have the SBC FQDN as the common name (CN) or the subject alternative name (SAN) field. The certificate should be issued directly from a certification authority, not from an intermediate provider.
+Communication Services only trusts certificates signed by Certificate Authorities (CAs) that are part of the Microsoft Trusted Root Certificate Program. Ensure that your SBC certificate is signed by a CA that is part of the program, and that the Extended Key Usage (EKU) extension of your certificate includes Server Authentication.
+Learn more:
-Alternatively, Communication Services direct routing supports a wildcard in the CN and/or SAN, and the wildcard needs to conform to standard [RFC HTTP Over TLS](https://tools.ietf.org/html/rfc2818#section-3.1).
+[Program Requirements - Microsoft Trusted Root Program](/security/trusted-root/program-requirements)
+
+[Included CA Certificate List](https://ccadb-public.secure.force.com/microsoft/IncludedCACertificateReportForMSFT)
-An example would be using `\*.contoso.com`, which would match the SBC FQDN `sbc.contoso.com`, but wouldn't match with `sbc.test.contoso.com`.
+SBC pairing works at the Communication Services resource level. This means you can pair many SBCs to a single Communication Services resource, but you can't pair a single SBC to more than one Communication Services resource. Unique SBC FQDNs are required for pairing to different resources.
-The certificate needs to be generated by one of the following root certificate authorities:
-
-- AffirmTrust
-- AddTrust External CA Root
-- Baltimore CyberTrust Root*
-- Buypass
-- Cybertrust
-- Class 3 Public Primary Certification Authority
-- Comodo Secure Root CA
-- Deutsche Telekom
-- DigiCert Global Root CA
-- DigiCert High Assurance EV Root CA
-- Entrust
-- GlobalSign
-- Go Daddy
-- GeoTrust
-- Verisign, Inc.
-- SSL.com
-- Starfield
-- Symantec Enterprise Mobile Root for Microsoft
-- SwissSign
-- Thawte Timestamping CA
-- Trustwave
-- TeliaSonera
-- T-Systems International GmbH (Deutsche Telekom)
-- QuoVadis
-- USERTrust RSA Certification Authority
-- Hongkong Post Root CA 1,2,3
-- Sectigo Root CA
-
-Microsoft is working on adding more certification authorities based on customer requests.
## SIP Signaling: FQDNs

The connection points for Communication Services direct routing are the following three FQDNs:

-- **sip.pstnhub.microsoft.com** – Global FQDN – must be tried first. When the SBC sends a request to resolve this name, the Microsoft Azure DNS servers return an IP address pointing to the primary Azure datacenter assigned to the SBC. The assignment is based on performance metrics of the datacenters and geographical proximity to the SBC. The IP address returned corresponds to the primary FQDN.
-- **sip2.pstnhub.microsoft.com** – Secondary FQDN – geographically maps to the second priority region.
-- **sip3.pstnhub.microsoft.com** – Tertiary FQDN – geographically maps to the third priority region.
+- **sip.pstnhub.microsoft.com** - Global FQDN - must be tried first. When the SBC sends a request to resolve this name, the Microsoft Azure DNS servers return an IP address that points to the primary Azure datacenter assigned to the SBC. The assignment is based on performance metrics of the datacenters and geographical proximity to the SBC. The IP address returned corresponds to the primary FQDN.
+- **sip2.pstnhub.microsoft.com** - Secondary FQDN - geographically maps to the second priority region.
+- **sip3.pstnhub.microsoft.com** - Tertiary FQDN - geographically maps to the third priority region.
-Placing these three FQDNs in order is required to:
+These three FQDNs, in order, are required to:
- Provide optimal experience (less loaded and closest to the SBC datacenter assigned by querying the first FQDN).
-- Provide failover when connection from an SBC is established to a datacenter that is experiencing a temporary issue. For more information, see [Failover mechanism](#failover-mechanism-for-sip-signaling) below.
+- Provide failover when connection from an SBC is established to a datacenter that is experiencing a temporary issue. For more information, see [Failover mechanism](#failover-mechanism-for-sip-signaling).
-The FQDNs ΓÇô sip.pstnhub.microsoft.com, sip2.pstnhub.microsoft.com, and sip3.pstnhub.microsoft.com ΓÇô will be resolved to one of the following IP addresses:
+The FQDNs (sip.pstnhub.microsoft.com, sip2.pstnhub.microsoft.com, and sip3.pstnhub.microsoft.com) resolve to one of the following IP addresses:
- `52.112.0.0/14 (IP addresses from 52.112.0.1 to 52.115.255.254)` - `52.120.0.0/14 (IP addresses from 52.120.0.1 to 52.123.255.254)`
Use the following ports for Communication Services Azure direct routing:
|Traffic|From|To|Source port|Destination port| |: |: |: |: |: |
-|SIP/TLS|SIP Proxy|SBC|1024 – 65535|Defined on the SBC (For Office 365 GCC High/DoD only port 5061 must be used)|
+|SIP/TLS|SIP Proxy|SBC|1024–65535|Defined on the SBC (For Office 365 GCC High/DoD only port 5061 must be used)|
SIP/TLS|SBC|SIP Proxy|Defined on the SBC|5061| ### Failover mechanism for SIP Signaling
-The SBC makes a DNS query to resolve sip.pstnhub.microsoft.com. Based on the SBC location and the datacenter performance metrics, the primary datacenter is selected. If the primary datacenter experiences an issue, the SBC will try the sip2.pstnhub.microsoft.com, which resolves to the second assigned datacenter, and, in the rare case that datacenters in two regions are not available, the SBC retries the last FQDN (sip3.pstnhub.microsoft.com), which provides the tertiary datacenter IP.
+The SBC makes a DNS query to resolve sip.pstnhub.microsoft.com. Based on the SBC location and the datacenter performance metrics, the primary datacenter is selected. If the primary datacenter experiences an issue, the SBC tries sip2.pstnhub.microsoft.com, which resolves to the second assigned datacenter. In the rare case that datacenters in two regions aren't available, the SBC retries the last FQDN (sip3.pstnhub.microsoft.com), which provides the tertiary datacenter IP.
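The retry order described above can be sketched as follows. This is an illustrative sketch only; `tryConnect` is a hypothetical placeholder for the SBC's actual TLS connection attempt, not part of any SDK.

```typescript
// Ordered direct-routing signaling FQDNs; the global FQDN must be tried first.
const SIGNALING_FQDNS: string[] = [
  "sip.pstnhub.microsoft.com",  // primary (global) FQDN
  "sip2.pstnhub.microsoft.com", // secondary priority region
  "sip3.pstnhub.microsoft.com", // tertiary priority region
];

// Returns the first FQDN for which the connection attempt succeeds,
// or null when all three regions are unavailable.
function selectSignalingTarget(tryConnect: (fqdn: string) => boolean): string | null {
  for (const fqdn of SIGNALING_FQDNS) {
    if (tryConnect(fqdn)) {
      return fqdn;
    }
  }
  return null;
}
```

When the primary datacenter is unreachable, the loop naturally falls through to the secondary and then tertiary FQDN, matching the failover behavior described above.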
## Media traffic: IP and Port ranges
The port range of the Media Processors is shown in the following table:
|Traffic|From|To|Source port|Destination port| |: |: |: |: |: |
-|UDP/SRTP|Media Processor|SBC|3478-3481 and 49152 – 53247|Defined on the SBC|
-|UDP/SRTP|SBC|Media Processor|Defined on the SBC|3478-3481 and 49152 – 53247|
+|UDP/SRTP|Media Processor|SBC|3478–3481 and 49152–53247|Defined on the SBC|
+|UDP/SRTP|SBC|Media Processor|Defined on the SBC|3478–3481 and 49152–53247|
> [!NOTE] > Microsoft recommends at least two ports per concurrent call on the SBC.
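As a rough capacity check against the two-ports-per-call recommendation above (an illustrative sketch, not official sizing guidance):

```typescript
// Number of ports in an inclusive range, e.g. 49152-53247.
function portsInRange(first: number, last: number): number {
  return last - first + 1;
}

// Maximum concurrent calls a range can carry at the recommended
// minimum of two ports per call.
function maxConcurrentCalls(first: number, last: number, portsPerCall: number = 2): number {
  return Math.floor(portsInRange(first, last) / portsPerCall);
}
```

For example, the 49152–53247 range contains 4,096 ports, which supports at most 2,048 concurrent calls at two ports per call.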
You can force use of the specific codec on the Session Border Controller by excl
### Leg between Communication Services Calling SDK app and Cloud Media Processor
-On the leg between the Cloud Media Processor and Communication Services Calling SDK app, G.722 is used. Microsoft is working on adding more codecs on this leg.
+On the leg between the Cloud Media Processor and Communication Services Calling SDK app, G.722 is used. Work on adding more codecs on this leg is in progress.
## Supported Session Border Controllers (SBCs)
communication-services Preferred Worker https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/how-tos/router-sdk/preferred-worker.md
+
+ Title: Target a Preferred Worker
+
+description: Use Azure Communication Services SDKs to target a job to a specific worker
+Last updated: 01/31/2022
+zone_pivot_groups: acs-js-csharp
+
+#Customer intent: As a developer, I want to target a specific worker
++
+# Target a Preferred Worker
+
+In the context of a call center, customers might be assigned an account manager or have a relationship with a specific worker. As such, you'd want to route a specific job to a specific worker if possible.
++
+## Prerequisites
+
+- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
+- A deployed Communication Services resource. [Create a Communication Services resource](../../quickstarts/create-communication-resource.md).
+- Optional: Complete the quickstart to [get started with Job Router](../../quickstarts/router/get-started-router.md)
+
+## Setup worker selectors
+
+Every worker automatically has an `Id` label. You can apply worker selectors to the job to target a specific worker.
+
+In the following example, a job is created that targets a specific worker. If that worker doesn't accept the job within the TTL of 1 minute, the condition for the specific worker is no longer valid and the job can go to any worker.
++
+```csharp
+await client.CreateJobAsync(
+ channelId: "<channel id>",
+ queueId: "<queue id>",
+ workerSelectors: new List<LabelSelector>
+ {
+ new LabelSelector(
+ key: "Id",
+ @operator: LabelOperator.Equal,
+ value: "<preferred worker id>",
+ ttl: TimeSpan.FromMinutes(1))
+ });
+```
+++
+```typescript
+await client.createJob({
+ channelId: "<channel id>",
+ queueId: "<queue id>",
+ workerSelectors: [
+ {
+ key: "Id",
+ operator: "equal",
+ value: "<preferred worker id>",
+ ttl: "00:01:00"
+ }
+ ]
+});
+```
++
+> [!TIP]
+> You could also use any custom label that is unique to each worker.
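For example, a selector can target a hypothetical custom label instead of the built-in `Id`. In this sketch, `employeeId` is an illustrative label name you would have applied when registering each worker; it is not a built-in label:

```typescript
// Hypothetical: each worker was registered with a unique custom
// "employeeId" label; the selector targets that label instead of Id.
const preferredWorkerSelector = {
  key: "employeeId",              // illustrative custom label, unique per worker
  operator: "equal",
  value: "<preferred employee id>",
  ttl: "00:01:00",                // fall back to any worker after one minute
};
```

The selector is passed in `workerSelectors` exactly as in the `Id` example above.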
container-apps Background Processing https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-apps/background-processing.md
# Tutorial: Deploy a background processing application with Azure Container Apps Preview
-Azure Container Apps allows you to deploy applications without requiring the exposure of public endpoints. In this tutorial, you deploy a sample application that reads messages from an Azure Storage Queue and logs the messages in Azure log Analytics workspace. Using Container Apps scale rules, the application can scale up and down based on the Azure Storage queue length. When there are no messages on the queue, the container app scales down to zero.
+Azure Container Apps allows you to deploy applications without exposing public endpoints. By using Container Apps scale rules, the application can scale up and down based on the Azure Storage queue length. When there are no messages on the queue, the container app scales down to zero.
You learn how to: > [!div class="checklist"]+ > * Create a Container Apps environment to deploy your container apps > * Create an Azure Storage Queue to send messages to the container app > * Deploy your background processing application as a container app > * Verify that the queue messages are processed by the container app
-## Prerequisites
-
-The following items are required to complete this tutorial:
-
-* **Azure CLI**: You must have Azure CLI version 2.29.0 or later installed on your local computer.
- * Run `az --version` to find the version. If you need to install or upgrade, see [Install the Azure CLI](/cli/azure/install-azure-cli).
-
-## Setup
-
-This tutorial makes use of the following environment variables:
-
-# [Bash](#tab/bash)
-
-```bash
-RESOURCE_GROUP="my-containerapps"
-LOCATION="canadacentral"
-CONTAINERAPPS_ENVIRONMENT="containerapps-env"
-LOG_ANALYTICS_WORKSPACE="containerapps-logs"
-```
-
-# [PowerShell](#tab/powershell)
-
-```powershell
-$RESOURCE_GROUP="my-containerapps"
-$LOCATION="canadacentral"
-$CONTAINERAPPS_ENVIRONMENT="containerapps-env"
-$LOG_ANALYTICS_WORKSPACE="containerapps-logs"
-```
---
-Create a variable for your storage account name.
-
-# [Bash](#tab/bash)
-
-```bash
-STORAGE_ACCOUNT="<MY_STORAGE_ACCOUNT_NAME>"
-```
-
-# [PowerShell](#tab/powershell)
-
-```powershell
-$STORAGE_ACCOUNT="<storage account name>"
-```
---
-Replace the `<storage account name>` placeholder with your own value before you run this snippet. Storage account names must be unique within Azure, be between 3 and 24 characters in length, and may contain numbers or lowercase letters only. The storage account will be created in a following step.
-
-Next, sign in to Azure from the CLI.
-
-Run the following command, and follow the prompts to complete the authentication process.
-
-# [Bash](#tab/bash)
-
-```azurecli
-az login
-```
-
-# [PowerShell](#tab/powershell)
-
-```azurecli
-az login
-```
---
-To ensure you're running the latest version of the CLI, use the `upgrade` command.
-
-# [Bash](#tab/bash)
-
-```azurecli
-az upgrade
-```
-
-# [PowerShell](#tab/powershell)
-
-```azurecli
-az upgrade
-```
---
-Next, install the Azure Container Apps extension to the CLI.
-
-# [Bash](#tab/bash)
-
-```azurecli
-az extension add \
- --source https://workerappscliextension.blob.core.windows.net/azure-cli-extension/containerapp-0.2.2-py2.py3-none-any.whl
-```
-
-# [PowerShell](#tab/powershell)
-
-```azurecli
-az extension add `
- --source https://workerappscliextension.blob.core.windows.net/azure-cli-extension/containerapp-0.2.2-py2.py3-none-any.whl
-```
---
-Now that the extension is installed, register the `Microsoft.Web` namespace.
-
-# [Bash](#tab/bash)
-
-```azurecli
-az provider register --namespace Microsoft.Web
-```
-
-# [PowerShell](#tab/powershell)
-
-```azurecli
-az provider register --namespace Microsoft.Web
-```
---
-You'll use a resource group to organize the services related to your new container app. Create the group with the following command:
-
-# [Bash](#tab/bash)
-
-```azurecli
-az group create \
- --name $RESOURCE_GROUP \
- --location "$LOCATION"
-```
-
-# [PowerShell](#tab/powershell)
-
-```azurecli
-az group create `
- --name $RESOURCE_GROUP `
- --location $LOCATION
-```
---
-With the CLI upgraded and a new resource group available, you can create a Container Apps environment and deploy your container app.
-
-## Create an environment
-
-Azure Container Apps environments act as secure boundary around a group of container apps. Different container apps in the same environment are deployed in the same virtual network and write logs to the same Log Analytics workspace.
-
-Azure Log Analytics is used to monitor your container app required when creating a Container Apps environment.
-
-Create a new Log Analytics workspace with the following command:
-
-# [Bash](#tab/bash)
-
-```azurecli
-az monitor log-analytics workspace create \
- --resource-group $RESOURCE_GROUP \
- --workspace-name $LOG_ANALYTICS_WORKSPACE
-```
-
-# [PowerShell](#tab/powershell)
-
-```azurecli
-az monitor log-analytics workspace create `
- --resource-group $RESOURCE_GROUP `
- --workspace-name $LOG_ANALYTICS_WORKSPACE
-```
---
-Next, retrieve the Log Analytics Client ID and client secret. Make sure to run each query separately to give enough time for the request to complete.
-
-# [Bash](#tab/bash)
-
-```azurecli
-LOG_ANALYTICS_WORKSPACE_CLIENT_ID=`az monitor log-analytics workspace show --query customerId -g $RESOURCE_GROUP -n $LOG_ANALYTICS_WORKSPACE --out json | tr -d '"'`
-```
-
-```azurecli
-LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET=`az monitor log-analytics workspace get-shared-keys --query primarySharedKey -g $RESOURCE_GROUP -n $LOG_ANALYTICS_WORKSPACE --out json | tr -d '"'`
-```
-
-# [PowerShell](#tab/powershell)
-
-```powershell
-$LOG_ANALYTICS_WORKSPACE_CLIENT_ID=(az monitor log-analytics workspace show --query customerId -g $RESOURCE_GROUP -n $LOG_ANALYTICS_WORKSPACE --out json | tr -d '"')
-```
-
-```powershell
-$LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET=`az monitor log-analytics workspace get-shared-keys --query primarySharedKey -g $RESOURCE_GROUP -n $LOG_ANALYTICS_WORKSPACE --out json | tr -d '"'`
-```
az containerapp env create `
## Set up a storage queue
+Choose a name for `STORAGE_ACCOUNT`. Storage account names must be *unique within Azure*, be from 3 to 24 characters in length, and contain numbers and lowercase letters only.
+
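As a quick local sanity check before creating the account (an illustrative sketch; the Azure service performs the authoritative validation), the naming rule can be expressed as:

```typescript
// Storage account names: 3-24 characters, numbers and lowercase letters only.
function isValidStorageAccountName(name: string): boolean {
  return /^[a-z0-9]{3,24}$/.test(name);
}
```

A name such as `mystorageacct123` passes the check, while names with uppercase letters, hyphens, or the wrong length do not.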
+# [Bash](#tab/bash)
+
+```bash
+STORAGE_ACCOUNT="<storage account name>"
+```
+
+# [PowerShell](#tab/powershell)
+
+```powershell
+$STORAGE_ACCOUNT_NAME="<storage account name>"
+```
+++ Create an Azure Storage account. # [Bash](#tab/bash)
az storage account create \
# [PowerShell](#tab/powershell)
-```azurecli
-az storage account create `
- --name $STORAGE_ACCOUNT `
- --resource-group $RESOURCE_GROUP `
- --location $LOCATION `
- --sku Standard_RAGRS `
- --kind StorageV2
+```powershell
+$STORAGE_ACCOUNT = New-AzStorageAccount `
+ -Name $STORAGE_ACCOUNT_NAME `
+ -ResourceGroupName $RESOURCE_GROUP `
+ -Location $LOCATION `
+ -SkuName Standard_RAGRS `
+ -Kind StorageV2
```
-Next, get the queue's connection string.
+Next, get the connection string for the queue.
# [Bash](#tab/bash)
QUEUE_CONNECTION_STRING=`az storage account show-connection-string -g $RESOURCE_
# [PowerShell](#tab/powershell) ```powershell
-$QUEUE_CONNECTION_STRING=(az storage account show-connection-string -g $RESOURCE_GROUP --name $STORAGE_ACCOUNT --query connectionString --out json | tr -d '"')
+$QUEUE_CONNECTION_STRING=(az storage account show-connection-string -g $RESOURCE_GROUP --name $STORAGE_ACCOUNT_NAME --query connectionString --out json) -replace '"',''
```
az storage queue create \
# [PowerShell](#tab/powershell)
-```azurecli
-az storage queue create `
- --name "myqueue" `
- --account-name $STORAGE_ACCOUNT `
- --connection-string $QUEUE_CONNECTION_STRING
+```powershell
+$queue = New-AzStorageQueue -Name "myqueue" `
+ -Context $STORAGE_ACCOUNT.Context
```
az storage message put \
# [PowerShell](#tab/powershell) ```powershell
-az storage message put `
- --content "Hello Queue Reader App" `
- --queue-name "myqueue" `
- --connection-string $QUEUE_CONNECTION_STRING
+$queueMessage = [Microsoft.Azure.Storage.Queue.CloudQueueMessage]::new("Hello Queue Reader App")
+$queue.CloudQueue.AddMessageAsync($queueMessage)
```
az deployment group create --resource-group "$RESOURCE_GROUP" \
# [PowerShell](#tab/powershell)
-```azurecli
-az deployment group create --resource-group "$RESOURCE_GROUP" `
- --template-file ./queue.json `
- --parameters `
- environment_name="$CONTAINERAPPS_ENVIRONMENT" `
- queueconnection="$QUEUE_CONNECTION_STRING" `
- location="$LOCATION"
+```powershell
+$params = @{
+ environment_name = $CONTAINERAPPS_ENVIRONMENT
+ location = $LOCATION
+ queueconnection=$QUEUE_CONNECTION_STRING
+}
+
+New-AzResourceGroupDeployment `
+ -ResourceGroupName $RESOURCE_GROUP `
+ -TemplateParameterObject $params `
+ -TemplateFile ./queue.json `
+ -SkipTemplateParameterPrompt
```
The application scales up to 10 replicas based on the queue length as defined in
## Verify the result
-The container app running as a background process creates logs entries in Log analytics as messages arrive from Azure Storage Queue. You may need to wait a few minutes for the analytics to arrive for the first time before you are able to query the logged data.
+The container app runs as a background process. As messages arrive from the Azure Storage Queue, the application creates log entries in Log Analytics. You might need to wait a few minutes for the first entries to arrive before you can query the logged data.
Run the following command to see logged messages. This command requires the Log Analytics extension, so accept the prompt to install the extension when requested.
az monitor log-analytics query \
# [PowerShell](#tab/powershell)
-```azurecli
-az monitor log-analytics query `
- --workspace $LOG_ANALYTICS_WORKSPACE_CLIENT_ID `
- --analytics-query "ContainerAppConsoleLogs_CL | where ContainerAppName_s == 'queuereader' and Log_s contains 'Message ID'" `
- --out table
+```powershell
+$queryResults = Invoke-AzOperationalInsightsQuery -WorkspaceId $LOG_ANALYTICS_WORKSPACE_CLIENT_ID -Query "ContainerAppConsoleLogs_CL | where ContainerAppName_s == 'queuereader' and Log_s contains 'Message ID'"
+$queryResults.Results
```
az monitor log-analytics query `
## Clean up resources
-Once you are done, clean up your Container Apps resources by running the following command to delete your resource group.
+Once you are done, run the following command to delete the resource group that contains your Container Apps resources.
# [Bash](#tab/bash)
az group delete \
# [PowerShell](#tab/powershell)
-```azurecli
-az group delete `
- --resource-group $RESOURCE_GROUP
+```powershell
+Remove-AzResourceGroup -Name $RESOURCE_GROUP -Force
```
container-apps Get Started Existing Container Image https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-apps/get-started-existing-container-image.md
Title: 'Quickstart: Deploy an existing container image using the Azure CLI'
+ Title: 'Quickstart: Deploy an existing container image with the Azure CLI'
description: Deploy an existing container image to Azure Container Apps Preview with the Azure CLI.
zone_pivot_groups: container-apps-registry-types
# Quickstart: Deploy an existing container image with the Azure CLI
-Azure Container Apps Preview enables you to run microservices and containerized applications on a serverless platform. With Container Apps, you enjoy the benefits of running containers while leaving behind the concerns of manually configuring cloud infrastructure and complex container orchestrators.
+The Azure Container Apps Preview service enables you to run microservices and containerized applications on a serverless platform. With Container Apps, you enjoy the benefits of running containers while you leave behind the concerns of manual cloud infrastructure configuration and complex container orchestrators.
This article demonstrates how to deploy an existing container to Azure Container Apps.
This article demonstrates how to deploy an existing container to Azure Container
## Prerequisites -- Azure account with an active subscription.
+- An Azure account with an active subscription.
- If you don't have one, you [can create one for free](https://azure.microsoft.com/free/). - Install the [Azure CLI](/cli/azure/install-azure-cli).
az containerapp env create \
--resource-group $RESOURCE_GROUP \ --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID \ --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET \
- --location "$LOCATION"
+ --location $LOCATION
``` # [PowerShell](#tab/powershell)
az containerapp env create `
--resource-group $RESOURCE_GROUP ` --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID ` --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET `
- --location "$LOCATION"
+ --location $LOCATION
``` ## Create a container app
-Now that you have an environment created, you can deploy your first container app. Using the `containerapp create` command, deploy a container image to Azure Container Apps.
+Now that you have an environment created, you can deploy your first container app. With the `containerapp create` command, deploy a container image to Azure Container Apps.
-The example shown in this article demonstrates how to use a custom container image with common commands. Your container image may need more parameters including the following items:
+The example shown in this article demonstrates how to use a custom container image with common commands. Your container image might need more parameters for the following items:
-- Setting the revision mode-- Defining secrets-- Defining environment variables-- Setting container CPU or memory requirements-- Enabling and configuring Dapr-- Enabling internal or internal ingress-- Providing minimum and maximum replica values or scale rules
+- Set the revision mode
+- Define secrets
+- Define environment variables
+- Set container CPU or memory requirements
+- Enable and configure Dapr
+- Enable external or internal ingress
+- Provide minimum and maximum replica values or scale rules
For details on how to provide values for any of these parameters to the `create` command, run `az containerapp create --help`.
Before you run this command, replace `<REGISTRY_CONTAINER_URL>` with the URL to
::: zone-end
-If you have enabled ingress on your container app, you can add `--query configuration.ingress.fqdn` to the `create` command to return the app's public URL.
+If you have enabled ingress on your container app, you can add `--query configuration.ingress.fqdn` to the `create` command to return the public URL for the application.
## Verify deployment
-To verify a successful deployment, you can query the Log Analytics workspace. You may need to wait a 5 to 10 minutes for the analytics to arrive for the first time before you are able to query the logs.
+To verify a successful deployment, you can query the Log Analytics workspace. You might have to wait 5–10 minutes after deployment for the analytics to arrive for the first time before you are able to query the logs.
-After about 5 to 10 minutes has passed after creating the container app, use the following steps to view logged messages.
+After about 5–10 minutes have passed, use the following steps to view logged messages.
# [Bash](#tab/bash)
az monitor log-analytics query \
# [PowerShell](#tab/powershell)
-```azurecli
-az monitor log-analytics query `
- --workspace $LOG_ANALYTICS_WORKSPACE_CLIENT_ID `
- --analytics-query "ContainerAppConsoleLogs_CL | where ContainerAppName_s == 'my-container-app' | project ContainerAppName_s, Log_s, TimeGenerated" `
+```powershell
+$queryResults = Invoke-AzOperationalInsightsQuery -WorkspaceId $LOG_ANALYTICS_WORKSPACE_CLIENT_ID -Query "ContainerAppConsoleLogs_CL | where ContainerAppName_s == 'my-container-app' | project ContainerAppName_s, Log_s, TimeGenerated"
+$queryResults.Results
```
az monitor log-analytics query `
## Clean up resources
-If you're not going to continue to use this application, you can delete the Azure Container Apps instance and all the associated services by removing the resource group.
+If you're not going to continue to use this application, run the following command to delete the resource group along with all the resources created in this quickstart.
# [Bash](#tab/bash)
az group delete \
# [PowerShell](#tab/powershell)
-```azurecli
-az group delete `
- --name $RESOURCE_GROUP
+```powershell
+Remove-AzResourceGroup -Name $RESOURCE_GROUP -Force
```
container-apps Get Started https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-apps/get-started.md
# Quickstart: Deploy your first container app
-Azure Container Apps Preview enables you to run microservices and containerized applications on a serverless platform. With Container Apps, you enjoy the benefits of running containers while leaving behind the concerns of manually configuring cloud infrastructure and complex container orchestrators.
+The Azure Container Apps Preview service enables you to run microservices and containerized applications on a serverless platform. With Container Apps, you enjoy the benefits of running containers while you leave behind the concerns of manually configuring cloud infrastructure and complex container orchestrators.
In this quickstart, you create a secure Container Apps environment and deploy your first container app. ## Prerequisites -- Azure account with an active subscription.
+- An Azure account with an active subscription.
- If you don't have one, you [can create one for free](https://azure.microsoft.com/free/). - Install the [Azure CLI](/cli/azure/install-azure-cli).
az containerapp env create \
--resource-group $RESOURCE_GROUP \ --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID \ --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET \
- --location "$LOCATION"
+ --location $LOCATION
``` # [PowerShell](#tab/powershell)
az containerapp env create `
--resource-group $RESOURCE_GROUP ` --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID ` --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET `
- --location "$LOCATION"
+ --location $LOCATION
``` ## Create a container app
-Now that you have an environment created, you can deploy your first container app. Using the `containerapp create` command, deploy a container image to Azure Container Apps.
+Now that you have an environment created, you can deploy your first container app. With the `containerapp create` command, deploy a container image to Azure Container Apps.
# [Bash](#tab/bash)
By setting `--ingress` to `external`, you make the container app available to pu
## Verify deployment
-The `create` command returned the container app's fully qualified domain name. Copy this location to a web browser and you'll see the following message.
+The `create` command returned the fully qualified domain name for the container app. Copy this location into a web browser to see the following message:
:::image type="content" source="media/get-started/azure-container-apps-quickstart.png" alt-text="Your first Azure Container Apps deployment."::: ## Clean up resources
-If you're not going to continue to use this application, you can delete the Azure Container Apps instance and all the associated services by removing the resource group.
+If you're not going to continue to use this application, run the following command to delete the resource group along with all the resources created in this quickstart.
# [Bash](#tab/bash)
az group delete \
# [PowerShell](#tab/powershell)
-```azurecli
-az group delete `
- --name $RESOURCE_GROUP
+```powershell
+Remove-AzResourceGroup -Name $RESOURCE_GROUP -Force
```
container-apps Microservices Dapr Azure Resource Manager https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-apps/microservices-dapr-azure-resource-manager.md
Title: 'Tutorial: Deploy a Dapr application to Azure Container Apps using an ARM or Bicep template'
-description: Deploy a Dapr application to Azure Container Apps using an ARM or Bicep template.
+ Title: 'Tutorial: Deploy a Dapr application to Azure Container Apps with an ARM or Bicep template'
+description: Deploy a Dapr application to Azure Container Apps with an ARM or Bicep template.
Previously updated : 11/02/2021 Last updated : 01/31/2022 zone_pivot_groups: container-apps
-# Tutorial: Deploy a Dapr application to Azure Container Apps using an ARM or Bicep template
+# Tutorial: Deploy a Dapr application to Azure Container Apps with an Azure Resource Manager or Bicep template
-[Dapr](https://dapr.io/) (Distributed Application Runtime) is a runtime that helps you build resilient stateless and stateful microservices. In this tutorial, a sample Dapr application is deployed to Azure Container Apps.
+[Dapr](https://dapr.io/) (Distributed Application Runtime) is a runtime that helps you build resilient stateless and stateful microservices. In this tutorial, a sample Dapr application is deployed to Azure Container Apps via an Azure Resource Manager (ARM) or Bicep template.
You learn how to: > [!div class="checklist"]+ > * Create a Container Apps environment for your container apps > * Create an Azure Blob Storage state store for the container app
-> * Deploy two apps that a produce and consume messages and persist them using the state store
+> * Deploy two apps that produce and consume messages and persist them with the state store
> * Verify the interaction between the two microservices.
-Azure Container Apps offers a fully managed version of the Dapr APIs when building microservices. When you use Dapr in Azure Container Apps, you can enable sidecars to run next to your microservices that provide a rich set of capabilities. Available Dapr APIs include [Service to Service calls](https://docs.dapr.io/developing-applications/building-blocks/service-invocation/), [Pub/Sub](https://docs.dapr.io/developing-applications/building-blocks/pubsub/), [Event Bindings](https://docs.dapr.io/developing-applications/building-blocks/bindings/), [State Stores](https://docs.dapr.io/developing-applications/building-blocks/state-management/), and [Actors](https://docs.dapr.io/developing-applications/building-blocks/actors/).
+With Azure Container Apps, you get a fully managed version of the Dapr APIs when building microservices. When you use Dapr in Azure Container Apps, you can enable sidecars to run next to your microservices that provide a rich set of capabilities. Available Dapr APIs include [Service to Service calls](https://docs.dapr.io/developing-applications/building-blocks/service-invocation/), [Pub/Sub](https://docs.dapr.io/developing-applications/building-blocks/pubsub/), [Event Bindings](https://docs.dapr.io/developing-applications/building-blocks/bindings/), [State Stores](https://docs.dapr.io/developing-applications/building-blocks/state-management/), and [Actors](https://docs.dapr.io/developing-applications/building-blocks/actors/).
+
+In this tutorial, you deploy the same applications from the Dapr [Hello World](https://github.com/dapr/quickstarts/tree/master/hello-kubernetes) quickstart.
+
+The application consists of:
-In this tutorial, you deploy the same applications from the Dapr [Hello World](https://github.com/dapr/quickstarts/tree/master/hello-kubernetes) quickstart. The quickstart consists of a client (Python) app that generates messages, and a service (Node) app that consumes and persists those messages in a configured state store. The following architecture diagram illustrates the components that make up this tutorial:
+* A client (Python) container app to generate messages.
+* A service (Node) container app to consume and persist those messages in a state store.
+
+The following architecture diagram illustrates the components that make up this tutorial:
:::image type="content" source="media/microservices-dapr/azure-container-apps-microservices-dapr.png" alt-text="Architecture diagram for Dapr Hello World microservices on Azure Container Apps"::: ## Prerequisites
-* [Azure CLI](/cli/azure/install-azure-cli)
+* Install [Azure CLI](/cli/azure/install-azure-cli)
::: zone pivot="container-apps-bicep"
In this tutorial, you deploy the same applications from the Dapr [Hello World](h
## Before you begin
-This guide makes use of the following environment variables:
+This guide uses the following environment variables:
# [Bash](#tab/bash)
$STORAGE_ACCOUNT_CONTAINER="mycontainer"
-The above snippet can be used to set the environment variables using bash, zsh, or PowerShell.
# [Bash](#tab/bash)
$STORAGE_ACCOUNT="<storage account name>"
-Choose a name for `STORAGE_ACCOUNT`. It will be created in a following step. Storage account names must be *unique within Azure*. It must be between 3 and 24 characters in length and may contain numbers and lowercase letters only.
+Choose a name for `STORAGE_ACCOUNT`. Storage account names must be *unique within Azure*, be from 3 to 24 characters in length, and contain numbers and lowercase letters only.
## Setup
-Begin by signing in to Azure.
-
-Run the following command, and follow the prompts to complete the authentication process.
+First, sign in to Azure.
# [Bash](#tab/bash)
az upgrade
-Next, install the Azure Container Apps extension to the CLI.
+Next, install the Azure Container Apps extension for the Azure CLI.
# [Bash](#tab/bash)
With the CLI upgraded and a new resource group available, you can create a Conta
## Create an environment
-Azure Container Apps environments act as isolation boundaries between a group of container apps. Container Apps deployed to the same environment share the same virtual network and write logs to the same Log Analytics workspace.
+The Azure Container Apps environment acts as a secure boundary around a group of container apps. Container Apps deployed to the same environment share a virtual network and write logs to the same Log Analytics workspace.
-Azure Log Analytics is used to monitor your container app and is required when creating a Container Apps environment.
+Your container apps are monitored with Azure Log Analytics, which is required when you create a Container Apps environment.
-Create a new Log Analytics workspace with the following command:
+Create a Log Analytics workspace with the following command:
# [Bash](#tab/bash)
az containerapp env create `
### Create an Azure Blob Storage account
-Use the following command to create a new Azure Storage account.
+Use the following command to create an Azure Storage account.
# [Bash](#tab/bash)
New-AzStorageAccount -ResourceGroupName $RESOURCE_GROUP `
Once your Azure Blob Storage account is created, the following values are needed for subsequent steps in this tutorial.
-* `storage_account_name` is the value of the `STORAGE_ACCOUNT` variable you chose above.
+* `storage_account_name` is the value of the `STORAGE_ACCOUNT` variable.
-* `storage_container_name` is the value of `STORAGE_ACCOUNT_CONTAINER` defined above (for example, `mycontainer`). Dapr creates a container with this name if it doesn't already exist in your Azure Storage account.
+* `storage_container_name` is the value of the `STORAGE_ACCOUNT_CONTAINER` variable.
-Get the storage account key with the following command.
+Dapr creates a container with this name if it doesn't already exist in your Azure Storage account.
+
+Get the storage account key with the following command:
# [Bash](#tab/bash)
$STORAGE_ACCOUNT_KEY=(Get-AzStorageAccountKey -ResourceGroupName $RESOURCE_GROUP
### Create Azure Resource Manager (ARM) templates
-Create two Azure Resource Manager (ARM) templates.
+Create two ARM templates.
-The ARM template has the Container App definition and a Dapr component definition.
+Each ARM template has a container app definition and a Dapr component definition.
The following example shows how your ARM template should look when configured for your Azure Blob Storage account.
Save the following file as *serviceapp.json*:
Create two Bicep templates.
-The Bicep template has the Container App definition and a Dapr component definition.
+Each Bicep template contains a container app definition and a Dapr component definition.
The following example shows how your Bicep template should look when configured for your Azure Blob Storage account.
resource pythonapp 'Microsoft.Web/containerApps@2021-03-01' = {
::: zone pivot="container-apps-arm"
-Now let's deploy the service Container App. Navigate to the directory in which you stored the ARM template file and run the command below.
+Now deploy the service Container App. Navigate to the directory in which you stored the ARM template file and run the following command:
# [Bash](#tab/bash)
New-AzResourceGroupDeployment `
::: zone pivot="container-apps-bicep"
-Now let's deploy the service Container App. Navigate to the directory in which you stored the Bicep template file and run the command below.
+Now deploy the service container. Navigate to the directory in which you stored the Bicep template file and run the following command:
-A warning (BCP081) may be displayed. This warning will have no effect on successfully deploying the Container App.
+A warning (BCP081) might be displayed. This warning has no effect on the successful deployment of the application.
# [Bash](#tab/bash)
New-AzResourceGroupDeployment `
-This command deploys the service (Node) app server on `targetPort: 3000` (the app's port) along with its accompanying Dapr sidecar configured with `"appId": "nodeapp",` and dapr `"appPort": 3000,` for service discovery and invocation. Your state store is configured using the `components` object of `"type": "state.azure.blobstorage"`, which enables the sidecar to persist state.
+This command deploys:
+
+* The service (Node) app server on `targetPort: 3000` (the app port)
+* Its accompanying Dapr sidecar, configured with `"appId": "nodeapp",` and Dapr `"appPort": 3000,` for service discovery and invocation
+
+Your state store is configured with the `components` object of `"type": "state.azure.blobstorage"`, which enables the sidecar to persist state.
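For illustration, the persistence the sidecar performs boils down to Dapr's state API: the app POSTs key/value pairs to the sidecar's local HTTP endpoint, and the configured Blob Storage component stores them. A minimal Python sketch of building that request (the `build_state_request` helper and the `statestore` component name are assumptions for illustration, not part of this tutorial's templates):

```python
import json

# Dapr's state API listens on the sidecar's local HTTP port (3500 by default).
DAPR_PORT = 3500

def build_state_request(store_name, key, value):
    """Return the URL and JSON body for a Dapr save-state call."""
    url = f"http://localhost:{DAPR_PORT}/v1.0/state/{store_name}"
    body = json.dumps([{"key": key, "value": value}])
    return url, body

# POSTing this body to this URL persists the order via the Blob Storage component.
url, body = build_state_request("statestore", "order", {"orderId": 42})
```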
## Deploy the client application (headless client)
-Run the command below to deploy the client container app.
+Run the following command to deploy the client container.
::: zone pivot="container-apps-arm"
New-AzResourceGroupDeployment `
::: zone pivot="container-apps-bicep"
-A warning (BCP081) may be displayed. This warning will have no effect on successfully deploying the Container App.
+A warning (BCP081) might be displayed. This warning has no effect on the successful deployment of the application.
# [Bash](#tab/bash)
This command deploys `pythonapp` that also runs with a Dapr sidecar that is used
### Confirm successful state persistence
-You can confirm the services are working correctly by viewing data in your Azure Storage account.
+You can confirm that the services are working correctly by viewing data in your Azure Storage account.
1. Open the [Azure portal](https://portal.azure.com) in your browser and navigate to your storage account.
-1. Select **Containers** on the left.
+1. Select **Containers** from the menu on the left side.
1. Select **mycontainer**.
1. Verify that you can see the file named `order` in the container.
-1. Click on the file.
+1. Select the file.
-1. Click the **Edit** tab.
+1. Select the **Edit** tab.
-1. Click the **Refresh** button to observe updates.
+1. Select the **Refresh** button to observe updates.
### View Logs
-Data logged via a container app are stored in the `ContainerAppConsoleLogs_CL` custom table in the Log Analytics workspace. You can view logs through the Azure portal or from the command line. You may need to wait a few minutes for the analytics to arrive for the first time before you can query the logged data.
+Data logged via a container app is stored in the `ContainerAppConsoleLogs_CL` custom table in the Log Analytics workspace. You can view logs through the Azure portal or from the command line. Allow a few minutes for the first analytics to arrive before you query the logged data.
Use the following command to view logs in bash or PowerShell.
nodeapp Successfully persisted state. PrimaryResult 2021-10-22
nodeapp Got a new order! Order ID: 63 PrimaryResult 2021-10-22T22:45:44.618Z ```
-> [!TIP]
-> Having issues? Let us know on GitHub by opening an issue in the [Azure Container Apps repo](https://github.com/microsoft/azure-container-apps).
- ## Clean up resources
-Once you're done, clean up your Container App resources by running the following command to delete your resource group.
+Once you are done, run the following command to delete your resource group along with all the resources you created in this tutorial.
# [Bash](#tab/bash)
Remove-AzResourceGroup -Name $RESOURCE_GROUP -Force
-This command deletes both container apps, the storage account, the container apps environment, and any other resources in the resource group.
- > [!NOTE] > Since `pythonapp` continuously makes calls to `nodeapp` with messages that get persisted into your configured state store, it is important to complete these cleanup steps to avoid ongoing billable operations.
+> [!TIP]
+> Having issues? Let us know on GitHub by opening an issue in the [Azure Container Apps repo](https://github.com/microsoft/azure-container-apps).
+ ## Next steps > [!div class="nextstepaction"]
container-apps Microservices Dapr https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-apps/microservices-dapr.md
You learn how to: > [!div class="checklist"]+ > * Create a Container Apps environment for your container apps > * Create an Azure Blob Storage state store for the container app
-> * Deploy two apps that produce and consume messages and persist them using the state store
+> * Deploy two apps that produce and consume messages and persist them in the state store
> * Verify the interaction between the two microservices.
-Azure Container Apps offers a fully managed version of the Dapr APIs when building microservices. When you use Dapr in Azure Container Apps, you can enable sidecars to run next to your microservices that provide a rich set of capabilities. Available Dapr APIs include [Service to Service calls](https://docs.dapr.io/developing-applications/building-blocks/service-invocation/), [Pub/Sub](https://docs.dapr.io/developing-applications/building-blocks/pubsub/), [Event Bindings](https://docs.dapr.io/developing-applications/building-blocks/bindings/), [State Stores](https://docs.dapr.io/developing-applications/building-blocks/state-management/), and [Actors](https://docs.dapr.io/developing-applications/building-blocks/actors/).
-
-In this tutorial, you deploy the same applications from the Dapr [Hello World](https://github.com/dapr/quickstarts/tree/master/hello-kubernetes) quickstart, which consists of a client (Python) app that generates messages, and a service (Node) app that consumes and persists those messages in a configured state store. The following architecture diagram illustrates the components that make up this tutorial:
--
-## Prerequisites
-
-* [Azure CLI](/cli/azure/install-azure-cli)
-
-## Before you begin
-
-This guide makes use of the following environment variables:
-
-# [Bash](#tab/bash)
-
-```bash
-RESOURCE_GROUP="my-containerapps"
-LOCATION="canadacentral"
-CONTAINERAPPS_ENVIRONMENT="containerapps-env"
-LOG_ANALYTICS_WORKSPACE="containerapps-logs"
-STORAGE_ACCOUNT_CONTAINER="mycontainer"
-```
-
-# [PowerShell](#tab/powershell)
-
-```powershell
-$RESOURCE_GROUP="my-containerapps"
-$LOCATION="canadacentral"
-$CONTAINERAPPS_ENVIRONMENT="containerapps-env"
-$LOG_ANALYTICS_WORKSPACE="containerapps-logs"
-$STORAGE_ACCOUNT_CONTAINER="mycontainer"
-```
---
-The above snippet can be used to set the environment variables using bash, zsh, or PowerShell.
+With Azure Container Apps, you get a fully managed version of the Dapr APIs when building microservices. When you use Dapr in Azure Container Apps, you can enable sidecars to run next to your microservices that provide a rich set of capabilities. Available Dapr APIs include [Service to Service calls](https://docs.dapr.io/developing-applications/building-blocks/service-invocation/), [Pub/Sub](https://docs.dapr.io/developing-applications/building-blocks/pubsub/), [Event Bindings](https://docs.dapr.io/developing-applications/building-blocks/bindings/), [State Stores](https://docs.dapr.io/developing-applications/building-blocks/state-management/), and [Actors](https://docs.dapr.io/developing-applications/building-blocks/actors/).
-# [Bash](#tab/bash)
+In this tutorial, you deploy the same applications from the Dapr [Hello World](https://github.com/dapr/quickstarts/tree/master/hello-kubernetes) quickstart.
-```bash
-STORAGE_ACCOUNT="<storage account name>"
-```
+The application consists of:
-# [PowerShell](#tab/powershell)
-
-```powershell
-$STORAGE_ACCOUNT="<storage account name>"
-```
+* A client (Python) app that generates messages
+* A service (Node) app that consumes and persists those messages in a configured state store
--
-Choose a name for `STORAGE_ACCOUNT`. It will be created in a following step. Storage account names must be *unique within Azure* and between 3 and 24 characters in length and may contain numbers and lowercase letters only.
-
-## Setup
-
-Begin by signing in to Azure from the CLI.
-
-Run the following command, and follow the prompts to complete the authentication process.
-
-# [Bash](#tab/bash)
-
-```azurecli
-az login
-```
+The following architecture diagram illustrates the components that make up this tutorial:
-# [PowerShell](#tab/powershell)
-
-```azurecli
-az login
-```
---
-Ensure you're running the latest version of the CLI via the upgrade command.
-
-# [Bash](#tab/bash)
-
-```azurecli
-az upgrade
-```
-
-# [PowerShell](#tab/powershell)
-
-```azurecli
-az upgrade
-```
---
-Next, install the Azure Container Apps extension to the CLI.
-
-# [Bash](#tab/bash)
-
-```azurecli
-az extension add \
- --source https://workerappscliextension.blob.core.windows.net/azure-cli-extension/containerapp-0.2.2-py2.py3-none-any.whl
-```
-
-# [PowerShell](#tab/powershell)
-
-```azurecli
-az extension add `
- --source https://workerappscliextension.blob.core.windows.net/azure-cli-extension/containerapp-0.2.2-py2.py3-none-any.whl
-```
---
-Now that the extension is installed, register the `Microsoft.Web` namespace.
-
-# [Bash](#tab/bash)
-
-```azurecli
-az provider register --namespace Microsoft.Web
-```
-
-# [PowerShell](#tab/powershell)
-```azurecli
-az provider register --namespace Microsoft.Web
-```
-Create a resource group to organize the services related to your new container app.
+Individual container apps are deployed to an Azure Container Apps environment. To create the environment, run the following command:
# [Bash](#tab/bash) ```azurecli
-az group create \
- --name $RESOURCE_GROUP \
+az containerapp env create \
+ --name $CONTAINERAPPS_ENVIRONMENT \
+ --resource-group $RESOURCE_GROUP \
+ --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID \
+ --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET \
--location "$LOCATION" ``` # [PowerShell](#tab/powershell) ```azurecli
-az group create `
- --name $RESOURCE_GROUP `
+az containerapp env create `
+ --name $CONTAINERAPPS_ENVIRONMENT `
+ --resource-group $RESOURCE_GROUP `
+ --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID `
+ --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET `
--location "$LOCATION" ```
-With the CLI upgraded and a new resource group available, you can create a Container Apps environment and deploy your container app.
-
-## Create an environment
-
-Azure Container Apps environments act as isolation boundaries between a group of container apps. Container Apps deployed to the same environment are deployed in the same virtual network and write logs to the same Log Analytics workspace.
-
-Azure Log Analytics is used to monitor your container app and is required when creating a Container Apps environment.
-
-Create a new Log Analytics workspace with the following command:
-
-# [Bash](#tab/bash)
-
-```azurecli
-az monitor log-analytics workspace create \
- --resource-group $RESOURCE_GROUP \
- --workspace-name $LOG_ANALYTICS_WORKSPACE
-```
+## Set up a state store
-# [PowerShell](#tab/powershell)
+### Create an Azure Blob Storage account
-```azurecli
-az monitor log-analytics workspace create `
- --resource-group $RESOURCE_GROUP `
- --workspace-name $LOG_ANALYTICS_WORKSPACE
-```
-
+Choose a name for `STORAGE_ACCOUNT`. Storage account names must be *unique within Azure*, from 3 to 24 characters in length, and can contain only numbers and lowercase letters.
-Next, retrieve the Log Analytics Client ID and client secret.
# [Bash](#tab/bash)
-Make sure to run each query separately to give enough time for the request to complete.
- ```bash
-LOG_ANALYTICS_WORKSPACE_CLIENT_ID=`az monitor log-analytics workspace show --query customerId -g $RESOURCE_GROUP -n $LOG_ANALYTICS_WORKSPACE --out tsv`
-```
-
-```bash
-LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET=`az monitor log-analytics workspace get-shared-keys --query primarySharedKey -g $RESOURCE_GROUP -n $LOG_ANALYTICS_WORKSPACE --out tsv`
+STORAGE_ACCOUNT="<storage account name>"
``` # [PowerShell](#tab/powershell)
-Make sure to run each query separately to give enough time for the request to complete.
-
-```powershell
-$LOG_ANALYTICS_WORKSPACE_CLIENT_ID=(az monitor log-analytics workspace show --query customerId -g $RESOURCE_GROUP -n $LOG_ANALYTICS_WORKSPACE --out tsv)
-```
- ```powershell
-$LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET=(az monitor log-analytics workspace get-shared-keys --query primarySharedKey -g $RESOURCE_GROUP -n $LOG_ANALYTICS_WORKSPACE --out tsv)
+$STORAGE_ACCOUNT="<storage account name>"
```
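The storage account naming rule above can be sanity-checked locally before you run the create command; a small sketch (the helper name is ours, not part of the tutorial):

```python
import re

def is_valid_storage_account_name(name: str) -> bool:
    """True when name is 3-24 characters of lowercase letters and digits only."""
    return re.fullmatch(r"[a-z0-9]{3,24}", name) is not None

print(is_valid_storage_account_name("mystorageacct123"))  # True
print(is_valid_storage_account_name("My-Storage"))        # False: uppercase and hyphen
```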
-Individual container apps are deployed to an Azure Container Apps environment. To create the environment, run the following command:
+Set the `STORAGE_ACCOUNT_CONTAINER` name.
# [Bash](#tab/bash)
-```azurecli
-az containerapp env create \
- --name $CONTAINERAPPS_ENVIRONMENT \
- --resource-group $RESOURCE_GROUP \
- --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID \
- --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET \
- --location "$LOCATION"
+```bash
+STORAGE_ACCOUNT_CONTAINER="mycontainer"
``` # [PowerShell](#tab/powershell)
-```azurecli
-az containerapp env create `
- --name $CONTAINERAPPS_ENVIRONMENT `
- --resource-group $RESOURCE_GROUP `
- --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID `
- --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET `
- --location "$LOCATION"
+```powershell
+$STORAGE_ACCOUNT_CONTAINER="mycontainer"
```
-## Set up a state store
-
-### Create an Azure Blob Storage account
-
-Use the following command to create a new Azure Storage account.
+Use the following command to create an Azure Storage account.
# [Bash](#tab/bash)
az storage account create \
# [PowerShell](#tab/powershell)
-```azurecli
-az storage account create `
- --name $STORAGE_ACCOUNT `
- --resource-group $RESOURCE_GROUP `
- --location "$LOCATION" `
- --sku Standard_RAGRS `
- --kind StorageV2
+```powershell
+New-AzStorageAccount -ResourceGroupName $RESOURCE_GROUP `
+ -Name $STORAGE_ACCOUNT `
+ -Location $LOCATION `
+ -SkuName Standard_RAGRS
``` Once your Azure Blob Storage account is created, the following values are needed for subsequent steps in this tutorial.
-* `storage_account_name` is the value of the `STORAGE_ACCOUNT` variable you chose above.
+* `storage_account_name` is the value of the `STORAGE_ACCOUNT` variable that you set previously.
-* `storage_container_name` is the value of `STORAGE_ACCOUNT_CONTAINER` defined above (for example, `mycontainer`). Dapr creates a container with this name if it doesn't already exist in your Azure Storage account.
+* `storage_container_name` is the value of the `STORAGE_ACCOUNT_CONTAINER` variable. Dapr creates a container with this name if it doesn't already exist in your Azure Storage account.
-Get the storage account key with the following command.
+Get the storage account key with the following command:
# [Bash](#tab/bash)
echo $STORAGE_ACCOUNT_KEY
# [PowerShell](#tab/powershell) ```powershell
-$STORAGE_ACCOUNT_KEY=(az storage account keys list --resource-group $RESOURCE_GROUP --account-name $STORAGE_ACCOUNT --query '[0].value' --out tsv)
+$STORAGE_ACCOUNT_KEY=(Get-AzStorageAccountKey -ResourceGroupName $RESOURCE_GROUP -AccountName $STORAGE_ACCOUNT)| Where-Object -Property KeyName -Contains 'key1' | Select-Object -ExpandProperty Value
``` ```powershell
echo $STORAGE_ACCOUNT_KEY
### Configure the state store component
-Using the properties you sourced from the steps above, create a config file named *components.yaml*. This file helps enable your Dapr app to access your state store. The following example shows how your *components.yaml* file should look when configured for your Azure Blob Storage account:
+Create a config file named *components.yaml* with the properties that you sourced in the previous steps. This file enables your Dapr app to access your state store. The following example shows how your *components.yaml* file should look when configured for your Azure Blob Storage account:
```yaml # components.yaml for Azure Blob storage component
To use this file, make sure to replace the placeholder values between the `<>` b
## Deploy the service application (HTTP web server)
-Navigate to the directory in which you stored the *components.yaml* file and run the command below to deploy the service container app.
+Navigate to the directory in which you stored the *components.yaml* file and run the following command to deploy the service container app.
# [Bash](#tab/bash)
az containerapp create `
-This command deploys the service (Node) app server on `--target-port 3000` (the app's port) along with its accompanying Dapr sidecar configured with `--dapr-app-id nodeapp` and `--dapr-app-port 3000` for service discovery and invocation. Your state store is configured using `--dapr-components ./components.yaml`, which enables the sidecar to persist state.
+This command deploys:
+
+* The service (Node) app server on `--target-port 3000` (the app port)
+* Its accompanying Dapr sidecar, configured with `--dapr-app-id nodeapp` and `--dapr-app-port 3000` for service discovery and invocation
+
+Your state store is configured using `--dapr-components ./components.yaml`, which enables the sidecar to persist state.
## Deploy the client application (headless client)
-Run the command below to deploy the client container app.
+Run the following command to deploy the client container app.
# [Bash](#tab/bash)
This command deploys `pythonapp` that also runs with a Dapr sidecar that is used
### Confirm successful state persistence
-You can confirm the services are working correctly by viewing data in your Azure Storage account.
+You can confirm that the services are working correctly by viewing data in your Azure Storage account.
1. Open the [Azure portal](https://portal.azure.com) in your browser and navigate to your storage account.
-1. Select **Containers** on the left.
+1. Select **Containers** from the left side menu.
1. Select **mycontainer**.
1. Verify that you can see the file named `order` in the container.
-1. Click on the file.
+1. Select the file.
-1. Click the **Edit** tab.
+1. Select the **Edit** tab.
-1. Click the **Refresh** button to observe how the data automatically updates.
+1. Select the **Refresh** button to observe how the data automatically updates.
### View Logs
-Data logged via a container app are stored in the `ContainerAppConsoleLogs_CL` custom table in the Log Analytics workspace. You can view logs through the Azure portal or with the CLI. You may need to wait a few minutes for the analytics to arrive for the first time before you are able to query the logged data.
+Data logged via a container app is stored in the `ContainerAppConsoleLogs_CL` custom table in the Log Analytics workspace. You can view logs through the Azure portal or with the CLI. Allow a few minutes for the first analytics to arrive before you can query the logged data.
Use the following CLI command to view logs on the command line.
az monitor log-analytics query \
# [PowerShell](#tab/powershell)
-```azurecli
-az monitor log-analytics query `
- --workspace $LOG_ANALYTICS_WORKSPACE_CLIENT_ID `
- --analytics-query "ContainerAppConsoleLogs_CL | where ContainerAppName_s == 'nodeapp' and (Log_s contains 'persisted' or Log_s contains 'order') | project ContainerAppName_s, Log_s, TimeGenerated | take 5" `
- --out table
+```powershell
+$queryResults = Invoke-AzOperationalInsightsQuery -WorkspaceId $LOG_ANALYTICS_WORKSPACE_CLIENT_ID -Query "ContainerAppConsoleLogs_CL | where ContainerAppName_s == 'nodeapp' and (Log_s contains 'persisted' or Log_s contains 'order') | project ContainerAppName_s, Log_s, TimeGenerated | take 5"
+$queryResults.Results
```
nodeapp Successfully persisted state. PrimaryResult 2021-10-22
nodeapp Got a new order! Order ID: 63 PrimaryResult 2021-10-22T22:45:44.618Z ```
-> [!TIP]
-> Having issues? Let us know on GitHub by opening an issue in the [Azure Container Apps repo](https://github.com/microsoft/azure-container-apps).
- ## Clean up resources
-Once you are done, clean up your Container App resources by running the following command to delete your resource group.
+Once you are done, run the following command to delete your resource group along with all the resources you created in this tutorial.
# [Bash](#tab/bash)
az group delete \
# [PowerShell](#tab/powershell)
-```azurecli
-az group delete `
- --resource-group $RESOURCE_GROUP
+```powershell
+Remove-AzResourceGroup -Name $RESOURCE_GROUP -Force
```
-This command deletes both container apps, the storage account, the container apps environment, and any other resources in the resource group.
-
-> [!NOTE]
+This command deletes the resource group that includes all of the resources created in this tutorial.
+> [!NOTE]
> Since `pythonapp` continuously makes calls to `nodeapp` with messages that get persisted into your configured state store, it is important to complete these cleanup steps to avoid ongoing billable operations.
+> [!TIP]
+> Having issues? Let us know on GitHub by opening an issue in the [Azure Container Apps repo](https://github.com/microsoft/azure-container-apps).
+ ## Next steps > [!div class="nextstepaction"]
container-registry Buffer Gate Public Content https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-registry/buffer-gate-public-content.md
Title: Manage public content in private container registry
description: Practices and workflows in Azure Container Registry to manage dependencies on public images from Docker Hub and other public content - Previously updated : 06/17/2021+ Last updated : 02/01/2022 # Manage public content with Azure Container Registry
As a recommended one-time step, [import](container-registry-import-images.md) ba
`az acr import` doesn't require a local Docker installation. You can run it with a local installation of the Azure CLI or directly in Azure Cloud Shell. It supports images of any OS type, multi-architecture images, or OCI artifacts such as Helm charts.
+Depending on your organization's needs, you can import to a dedicated registry or a repository in a shared registry.
+
+# [Azure CLI](#tab/azure-cli)
Example: ```azurecli-interactive
az acr import \
--password <Docker Hub token> ```
-Depending on your organization's needs, you can import to a dedicated registry or a repository in a shared registry.
+# [PowerShell](#tab/azure-powershell)
+Example:
+
+```azurepowershell-interactive
+Import-AzContainerRegistryImage `
+  -SourceImage library/busybox:latest `
+  -ResourceGroupName $resourceGroupName `
+  -RegistryName $RegistryName `
+  -SourceRegistryUri docker.io `
+  -TargetTag busybox:latest
+```
+ Credentials are required if the source registry is not available publicly or the admin user is disabled.
## Update image references
data-factory Author Global Parameters https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/author-global-parameters.md
Previously updated : 05/12/2021 Last updated : 01/31/2022
After a global parameter is created, you can edit it by clicking the parameter's
:::image type="content" source="media/author-global-parameters/create-global-parameter-3.png" alt-text="Create global parameters":::
+Global parameters are stored as part of the /factory/{factory_name}-arm-template parameters.json.
+ ## Using global parameters in a pipeline Global parameters can be used in any [pipeline expression](control-flow-expression-language-functions.md). If a pipeline is referencing another resource such as a dataset or data flow, you can pass down the global parameter value via that resource's parameters. Global parameters are referenced as `pipeline().globalParameters.<parameterName>`.
$globalParametersJson = Get-Content $globalParametersFilePath
Write-Host "Parsing JSON..." $globalParametersObject = [Newtonsoft.Json.Linq.JObject]::Parse($globalParametersJson)
-foreach ($gp in $globalParametersObject.GetEnumerator()) {
+foreach ($gp in $factoryFileObject.properties.globalParameters.GetEnumerator()) {
+ # foreach ($gp in $globalParametersObject.GetEnumerator()) {
Write-Host "Adding global parameter:" $gp.Key $globalParameterValue = $gp.Value.ToObject([Microsoft.Azure.Management.DataFactory.Models.GlobalParameterSpecification]) $newGlobalParameters.Add($gp.Key, $globalParameterValue)
data-factory Data Factory Troubleshoot Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-factory-troubleshoot-guide.md
Previously updated : 09/30/2021 Last updated : 01/28/2022
For more information, see [Getting started with Fiddler](https://docs.telerik.co
## General
+### REST continuation token NULL error
+
+**Error message:** {\"token\":null,\"range\":{\"min\":\..}
+
+**Cause:** When a query spans multiple partitions/pages, the backend service returns the continuation token as a JObject with three properties: **token** plus the **min** and **max** key ranges, for instance {\"token\":null,\"range\":{\"min\":\"05C1E9AB0DAD76\",\"max\":\"05C1E9CD673398\"}}. Depending on the source data, a page of the query can return a null token even though there is more data to fetch.
+
+**Recommendation:** When the continuationToken is non-null, as in the string {\"token\":null,\"range\":{\"min\":\"05C1E9AB0DAD76\",\"max\":\"05C1E9CD673398\"}}, call the queryActivityRuns API again with the continuation token from the previous response. Pass the full string to the query API. The activities are returned in the subsequent pages of the query result. Don't stop on an empty array in a page; as long as the full continuationToken value isn't null, continue querying. For more details, see the [REST API for pipeline run query](/rest/api/datafactory/activity-runs/query-by-pipeline-run).
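The recommended loop can be sketched as follows. Here `query_page` stands in for a call to the queryActivityRuns REST API and is an assumption for illustration, not an SDK function:

```python
def fetch_all_activity_runs(query_page):
    """Collect activity runs across pages.

    query_page(continuation_token) must return a dict shaped like the
    queryActivityRuns response: {"value": [...], "continuationToken": ...}.
    Keep querying while the token is non-null, even when a page's "value"
    list is empty.
    """
    runs, token = [], None
    while True:
        page = query_page(token)
        runs.extend(page.get("value", []))
        token = page.get("continuationToken")
        if not token:  # null/absent token means no more pages
            break
    return runs
```

Note that the full token string from the previous response is passed back verbatim, even when its inner `"token"` field is null.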
++ ### Activity stuck issue When you observe that the activity is running much longer than your normal runs with barely no progress, it may happen to be stuck. You can try canceling it and retry to see if it helps. If it's a copy activity, you can learn about the performance monitoring and troubleshooting from [Troubleshoot copy activity performance](copy-activity-performance-troubleshooting.md); if it's a data flow, learn from [Mapping data flows performance](concepts-data-flow-performance.md) and tuning guide.
For more troubleshooting help, try these resources:
* [Stack Overflow forum for Data Factory](https://stackoverflow.com/questions/tagged/azure-data-factory) * [Twitter information about Data Factory](https://twitter.com/hashtag/DataFactory) * [Azure videos](https://azure.microsoft.com/resources/videos/index/)
-* [Microsoft Q&A question page](/answers/topics/azure-data-factory.html)
+* [Microsoft Q&A question page](/answers/topics/azure-data-factory.html)
data-factory Data Flow Expression Functions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-expression-functions.md
___
<a name="fromBase64" ></a> ### <code>fromBase64</code>
-<code><b>fromBase64(<i>&lt;value1&gt;</i> : string) => string</b></code><br/><br/>
-Decodes the given base64-encoded string.
+<code><b>fromBase64(<i>&lt;value1&gt;</i> : string, [<i>&lt;encoding type&gt;</i> : string]) => string</b></code><br/><br/>
+Decodes the given base64-encoded string. You can optionally pass the encoding type.
* ``fromBase64('Z3VuY2h1cw==') -> 'gunchus'``
+* ``fromBase64('SGVsbG8gV29ybGQ=', 'Windows-1252') -> 'Hello World'``
___
___
<a name="toBase64" ></a> ### <code>toBase64</code>
-<code><b>toBase64(<i>&lt;value1&gt;</i> : string) => string</b></code><br/><br/>
-Encodes the given string in base64.
-* ``toBase64('bojjus') -> 'Ym9qanVz'``
-___
+<code><b>toBase64(<i>&lt;value1&gt;</i> : string, [<i>&lt;encoding type&gt;</i> : string]) => string</b></code><br/><br/>
+Encodes the given string in base64. You can optionally pass the encoding type.
+* ``toBase64('bojjus') -> 'Ym9qanVz'``
+* ``toBase64('± 25000, € 5.000,- |', 'Windows-1252') -> 'sSAyNTAwMCwggCA1LjAwMCwtIHw='``
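Both base64 examples can be reproduced with Python's standard library, which makes the role of the encoding-type argument concrete (a Python analogue for illustration, not the data flow runtime itself):

```python
import base64

# fromBase64 analogue: decode the base64 text, then decode the bytes.
decoded = base64.b64decode('Z3VuY2h1cw==').decode('utf-8')  # 'gunchus'

# toBase64 analogue with an explicit encoding: the plus-minus and euro signs
# are single bytes in Windows-1252, so the output differs from a UTF-8 encoding.
encoded = base64.b64encode('± 25000, € 5.000,- |'.encode('windows-1252')).decode('ascii')
print(decoded)  # gunchus
print(encoded)  # sSAyNTAwMCwggCA1LjAwMCwtIHw=
```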
+___
<a name="toBinary" ></a>
___
<a name="toString" ></a> ### <code>toString</code>
-<code><b>toString(<i>&lt;value&gt;</i> : any, [<i>&lt;number format/date format&gt;</i> : string]) => string</b></code><br/><br/>
-Converts a primitive datatype to a string. For numbers and date a format can be specified. If unspecified the system default is picked.Java decimal format is used for numbers. Refer to Java SimpleDateFormat for all possible date formats; the default format is yyyy-MM-dd.
+<code><b>toString(<i>&lt;value&gt;</i> : any, [<i>&lt;number format/date format&gt;</i> : string], [<i>&lt;date locale&gt;</i> : string]) => string</b></code><br/><br/>
+Converts a primitive datatype to a string. For numbers and dates, a format can be specified. If unspecified, the system default is picked. Java decimal format is used for numbers. Refer to Java SimpleDateFormat for all possible date formats; the default format is yyyy-MM-dd. For date or timestamp, a locale can be optionally specified.
* ``toString(10) -> '10'`` * ``toString('engineer') -> 'engineer'`` * ``toString(123456.789, '##,###.##') -> '123,456.79'``
Converts a primitive datatype to a string. For numbers and date a format can be
* ``toString(toDate('2018-12-31')) -> '2018-12-31'`` * ``isNull(toString(toDate('2018-12-31', 'MM/dd/yy'))) -> true`` * ``toString(4 == 20) -> 'false'``
-___
-
+* ``toString(toDate('12/31/18', 'MM/dd/yy', 'es-ES'), 'MM/dd/yy', 'de-DE')``
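For comparison, the `'##,###.##'` number-format example behaves like grouped fixed-point formatting in other languages; a Python analogue of the same grouping and rounding (not the Java DecimalFormat the runtime uses):

```python
# Group thousands with commas and keep two fraction digits,
# matching the documented result of toString(123456.789, '##,###.##').
value = 123456.789
formatted = f"{value:,.2f}"
print(formatted)  # 123,456.79
```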
+ ___
<a name="toTimestamp" ></a>
data-factory Data Flow Troubleshoot Errors https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-troubleshoot-errors.md
Previously updated : 10/01/2021 Last updated : 01/21/2022 # Common error codes and messages
This article lists common error codes and messages reported by mapping data flow
If you're running the data flow in a debug test execution from a debug pipeline run, you might run into this condition more frequently. That's because Azure Data Factory throttles the broadcast timeout to 60 seconds to maintain a faster debugging experience. You can extend the timeout to the 300-second timeout of a triggered run. To do so, you can use the **Debug** > **Use Activity Runtime** option to use the Azure IR defined in your Execute Data Flow pipeline activity. -- **Message**: Broadcast join timeout error, you can choose 'Off' of broadcast option in join/exists/lookup transformation to avoid this issue. If you intend to broadcast join option to improve performance then make sure broadcast stream can produce data within 60 secs in debug runs and 300 secs in job runs.
+- **Message**: Broadcast join timeout error, you can choose 'Off' of broadcast option in join/exists/lookup transformation to avoid this issue. If you intend to use the broadcast join option to improve performance, then make sure the broadcast stream can produce data within 60 secs in debug runs and 300 secs in job runs.
- **Cause**: Broadcast has a default timeout of 60 seconds in debug runs and 300 seconds in job runs. On the broadcast join, the stream chosen for broadcast is too large to produce data within this limit. If a broadcast join isn't used, the default broadcast by dataflow can reach the same limit. - **Recommendation**: Turn off the broadcast option or avoid broadcasting large data streams for which the processing can take more than 60 seconds. Choose a smaller stream to broadcast. Large Azure SQL Data Warehouse tables and source files aren't typically good choices. In the absence of a broadcast join, use a larger cluster if this error occurs.
This article lists common error codes and messages reported by mapping data flow
- **Recommendation**: Set an alias if you're using a SQL function like min() or max(). ## Error code: DF-Executor-DriverError-- **Message**: INT96 is legacy timestamp type which is not supported by ADF Dataflow. Please consider upgrading the column type to the latest types.
+- **Message**: INT96 is a legacy timestamp type, which is not supported by ADF Dataflow. Please consider upgrading the column type to the latest types.
- **Cause**: Driver error. - **Recommendation**: INT96 is a legacy timestamp type that's not supported by Azure Data Factory data flow. Consider upgrading the column type to the latest type.
This article lists common error codes and messages reported by mapping data flow
- **Recommendation**: Contact the Microsoft product team for more details about this problem. ## Error code: DF-Executor-PartitionDirectoryError-- **Message**: The specified source path has either multiple partitioned directories (for e.g. &lt;Source Path&gt;/<Partition Root Directory 1>/a=10/b=20, &lt;Source Path&gt;/&lt;Partition Root Directory 2&gt;/c=10/d=30) or partitioned directory with other file or non-partitioned directory (for example &lt;Source Path&gt;/&lt;Partition Root Directory 1&gt;/a=10/b=20, &lt;Source Path&gt;/Directory 2/file1), remove partition root directory from source path and read it through separate source transformation.
+- **Message**: The specified source path has either multiple partitioned directories (for example, &lt;Source Path&gt;/<Partition Root Directory 1>/a=10/b=20, &lt;Source Path&gt;/&lt;Partition Root Directory 2&gt;/c=10/d=30) or partitioned directory with other file or non-partitioned directory (for example &lt;Source Path&gt;/&lt;Partition Root Directory 1&gt;/a=10/b=20, &lt;Source Path&gt;/Directory 2/file1), remove partition root directory from source path and read it through separate source transformation.
- **Cause**: The source path has either multiple partitioned directories or a partitioned directory that has another file or non-partitioned directory. - **Recommendation**: Remove the partitioned root directory from the source path and read it through separate source transformation.
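The layouts this error rejects can be sketched with a small check: under the source path, either more than one partition root directory, or a partition root mixed with plain files or non-partitioned directories. The function below is a hypothetical illustration, not the check the data flow runtime actually performs.

```python
import os

def has_partition_conflict(source_path):
    """Sketch of the layouts DF-Executor-PartitionDirectoryError describes:
    more than one partition root under source_path, or a partition root
    mixed with other files/directories. Illustrative only."""
    def is_partition_root(path):
        # A directory whose children look like key=value partition folders.
        return os.path.isdir(path) and any('=' in c for c in os.listdir(path))

    children = [os.path.join(source_path, c) for c in os.listdir(source_path)]
    roots = [c for c in children if is_partition_root(c)]
    others = [c for c in children if c not in roots]
    return len(roots) > 1 or (len(roots) == 1 and len(others) > 0)
```

For example, a source path containing both `root1/a=10/b=20` and `root2/c=10` would be flagged, while a single partition root on its own would not.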
This article lists common error codes and messages reported by mapping data flow
## Error code: InvalidTemplate - **Message**: The pipeline expression cannot be evaluated. - **Cause**: The pipeline expression passed in the Data Flow activity isn't being processed correctly because of a syntax error.-- **Recommendation**: Check your activity in activity monitoring to verify the expression.
+- **Recommendation**: Check the data flow activity name, and check the expressions in activity monitoring to verify them. For example, a data flow activity name cannot contain a space or a hyphen.
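The two characters the recommendation calls out can be screened for with a trivial check. This is a minimal sketch covering only those two characters; ADF activity names have more naming rules than this, and the function name is made up here.

```python
def invalid_activity_name_chars(name):
    """Sketch: return the characters the recommendation calls out
    (space and hyphen) that appear in a data flow activity name.
    ADF enforces more naming rules than this; illustrative only."""
    return [ch for ch in (' ', '-') if ch in name]

print(invalid_activity_name_chars('Data flow-1'))  # [' ', '-']
print(invalid_activity_name_chars('DataFlow1'))    # []
```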
## Error code: 2011 - **Message**: The activity was running on Azure Integration Runtime and failed to decrypt the credential of data store or compute connected via a Self-hosted Integration Runtime. Please check the configuration of linked services associated with this activity, and make sure to use the proper integration runtime type.
This article lists common error codes and messages reported by mapping data flow
## Error code: DF-Hive-InvalidBlobStagingConfiguration - **Message**: Blob storage staging properties should be specified. - **Cause**: An invalid staging configuration is provided in the Hive.-- **Recommendation**: Please check if the account key, account name and container are set properly in the related Blob linked service which is used as staging.
+- **Recommendation**: Please check that the account key, account name, and container are set properly in the related Blob linked service, which is used as staging.
## Error code: DF-Hive-InvalidGen2StagingConfiguration - **Message**: ADLS Gen2 storage staging only support service principal key credential.
For more help with troubleshooting, see these resources:
- [Data Factory feature requests](/answers/topics/azure-data-factory.html) - [Azure videos](https://azure.microsoft.com/resources/videos/index/?sort=newest&services=data-factory) - [Stack Overflow forum for Data Factory](https://stackoverflow.com/questions/tagged/azure-data-factory)-- [Twitter information about Data Factory](https://twitter.com/hashtag/DataFactory)
+- [Twitter information about Data Factory](https://twitter.com/hashtag/DataFactory)
data-factory Solution Template Copy New Files Lastmodifieddate https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/solution-template-copy-new-files-lastmodifieddate.md
Previously updated : 3/8/2019 Last updated : 01/31/2022 # Copy new and changed files by LastModifiedDate with Azure Data Factory
The template defines six parameters:
1. Go to template **Copy new files only by LastModifiedDate**. Create a **New** connection to your source storage store. The source storage store is where you want to copy files from.
- :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate1.png" alt-text="Create a new connection to the source":::
+ :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate-1.png" alt-text="Create a new connection to the source":::
2. Create a **New** connection to your destination store. The destination store is where you want to copy files to.
- :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate3.png" alt-text="Create a new connection to the destination":::
+ :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate-3.png" alt-text="Create a new connection to the destination":::
3. Select **Use this template**.
- :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate4.png" alt-text="Use this template":::
+ :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate-4.png" alt-text="Use this template":::
4. You will see the pipeline available in the panel, as shown in the following example:
- :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate5.png" alt-text="Show the pipeline":::
+ :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate-5.png" alt-text="Show the pipeline":::
5. Select **Debug**, enter the values for the **Parameters**, and select **Finish**. In the picture below, we set the parameters as follows. - **FolderPath_Source** = sourcefolder
The template defines six parameters:
The example indicates that files last modified within the timespan (**2019-02-01T00:00:00Z** to **2019-03-01T00:00:00Z**) will be copied from the source path **sourcefolder/subfolder** to the destination path **destinationfolder/subfolder**. You can replace these with your own parameters.
- :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate6.png" alt-text="Run the pipeline":::
+ :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate-6.png" alt-text="Run the pipeline":::
6. Review the result. You will see that only the files last modified within the configured timespan have been copied to the destination store.
- :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate7.png" alt-text="Review the result":::
+ :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate-7.png" alt-text="Review the result":::
7. Now you can add a tumbling windows trigger to automate this pipeline, so that the pipeline can always copy new and changed files only by LastModifiedDate periodically. Select **Add trigger**, and select **New/Edit**.
- :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate8.png" alt-text="Screenshot that highlights the New/Edit menu option that appears when you select Add trigger.":::
+ :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate-8.png" alt-text="Screenshot that highlights the New/Edit menu option that appears when you select Add trigger.":::
8. In the **Add Triggers** window, select **+ New**. 9. Select **Tumbling Window** for the trigger type, set **Every 15 minute(s)** as the recurrence (you can change it to any interval). Select **Yes** for the **Activated** box, and then select **OK**.
- :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate10.png" alt-text="Create trigger":::
+ :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate-10.png" alt-text="Create trigger":::
10. Set the values for the **Trigger Run Parameters** as follows, and select **Finish**. - **FolderPath_Source** = **sourcefolder**. You can replace it with your own folder in the source data store.
The template defines six parameters:
- **LastModified_From** = **\@trigger().outputs.windowStartTime**. It is a system variable from the trigger that gives the time when the pipeline was last triggered. - **LastModified_To** = **\@trigger().outputs.windowEndTime**. It is a system variable from the trigger that gives the time when the pipeline is triggered this time.
- :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate11.png" alt-text="Input parameters":::
+ :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate-11.png" alt-text="Input parameters":::
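The `windowStartTime`/`windowEndTime` pair matters because tumbling windows tile time with no gaps or overlaps, so each file's LastModifiedDate falls into exactly one run's window. A Python sketch of that tiling (the `tumbling_windows` helper name is hypothetical; the real values come from the trigger's system variables):

```python
from datetime import datetime, timedelta

def tumbling_windows(start, interval_minutes, count):
    """Sketch of how a tumbling window trigger tiles time: each run
    receives a contiguous [windowStartTime, windowEndTime) pair, so a
    file modified at any instant belongs to exactly one window."""
    step = timedelta(minutes=interval_minutes)
    return [(start + i * step, start + (i + 1) * step) for i in range(count)]

for ws, we in tumbling_windows(datetime(2022, 2, 1, 0, 0), 15, 3):
    print(ws.isoformat(), '->', we.isoformat())
# 2022-02-01T00:00:00 -> 2022-02-01T00:15:00
# 2022-02-01T00:15:00 -> 2022-02-01T00:30:00
# 2022-02-01T00:30:00 -> 2022-02-01T00:45:00
```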
11. Select **Publish All**.
- :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate12.png" alt-text="Publish All":::
+ :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate-12.png" alt-text="Publish All":::
12. Create new files in the source folder of your data store. Wait for the pipeline to be triggered automatically; only the new files will be copied to the destination store.
The template defines six parameters:
14. Review the result. You will see that your pipeline is triggered automatically every 15 minutes, and only the new or changed files from the source store are copied to the destination store in each pipeline run.
- :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate15.png" alt-text="Screenshot that shows the results that return when the pipeline is triggered.":::
+ :::image type="content" source="media/solution-template-copy-new-files-lastmodifieddate/copy-new-files-lastmodifieddate-15.png" alt-text="Screenshot that shows the results that return when the pipeline is triggered.":::
## Next steps
databox Data Box Customer Managed Encryption Key Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox/data-box-customer-managed-encryption-key-portal.md
To enable a customer-managed key for your existing Data Box order in the Azure p
![Customer-managed key URL](./media/data-box-customer-managed-encryption-key-portal/customer-managed-key-11.png) > [!IMPORTANT]
-> You must enable the `Get`, `UnwrapKey`, and `WrapKey` permissions on the key. To set the permissions in Azure CLI, see [az keyvault set-policy](/cli/azure/keyvault?view=azure-cli-latest#az_keyvault_set_policy).
+> You must enable the `Get`, `UnwrapKey`, and `WrapKey` permissions on the key. To set the permissions in Azure CLI, see [az keyvault set-policy](/cli/azure/keyvault#az_keyvault_set_policy).
## Change key
To change the key vault, key, and/or key version for the customer-managed key yo
![Save updated encryption settings - 1](./media/data-box-customer-managed-encryption-key-portal/customer-managed-key-17-a.png) > [!IMPORTANT]
-> You must enable the `Get`, `UnwrapKey`, and `WrapKey` permissions on the key. To set the permissions in Azure CLI, see [az keyvault set-policy](/cli/azure/keyvault?view=azure-cli-latest#az_keyvault_set_policy).
+> You must enable the `Get`, `UnwrapKey`, and `WrapKey` permissions on the key. To set the permissions in Azure CLI, see [az keyvault set-policy](/cli/azure/keyvault#az_keyvault_set_policy).
## Change identity
defender-for-cloud Supported Machines Endpoint Solutions Clouds https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-cloud/supported-machines-endpoint-solutions-clouds.md
Title: Microsoft Defender for Cloud's features according to OS, machine type, and cloud description: Learn about the availability of Microsoft Defender for Cloud features according to OS, machine type, and cloud deployment. Previously updated : 12/27/2021 Last updated : 02/01/2022
[!INCLUDE [Banner for top of topics](./includes/banner.md)]
-The two **tabs** below show the features of Microsoft Defender for Cloud that are available for Windows and Linux machines.
+The **tabs** below show the features of Microsoft Defender for Cloud that are available for Windows and Linux machines.
## Supported features for virtual machines and servers <a name="vm-server-features"></a> ### [**Windows machines**](#tab/features-windows)
-| **Feature** | **Azure Virtual Machines** | **Azure Virtual Machine Scale Sets** | **Azure Arc-enabled machines** | **Enhanced security features required** |
+| **Feature** | **Azure Virtual Machines** | **Azure Virtual Machine Scale Sets** | **Azure Arc-enabled machines** | **Defender for servers required** |
|--|::|::|::|::| | [Microsoft Defender for Endpoint integration](integration-defender-for-endpoint.md) | Γ£ö</br>(on supported versions) | Γ£ö</br>(on supported versions) | Γ£ö | Yes | | [Virtual machine behavioral analytics (and security alerts)](alerts-reference.md) | Γ£ö | Γ£ö | Γ£ö | Yes |
The two **tabs** below show the features of Microsoft Defender for Cloud that ar
### [**Linux machines**](#tab/features-linux)
-| **Feature** | **Azure Virtual Machines** | **Azure Virtual Machine Scale Sets** | **Azure Arc-enabled machines** | **Enhanced security features required** |
+| **Feature** | **Azure Virtual Machines** | **Azure Virtual Machine Scale Sets** | **Azure Arc-enabled machines** | **Defender for servers required** |
|--|::|::|::|::| | [Microsoft Defender for Endpoint integration](integration-defender-for-endpoint.md) | Γ£ö | - | Γ£ö | Yes | | [Virtual machine behavioral analytics (and security alerts)](./azure-defender.md) | Γ£ö</br>(on supported versions) | Γ£ö</br>(on supported versions) | Γ£ö | Yes |
devtest-labs Samples Powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/devtest-labs/samples-powershell.md
Title: Azure PowerShell Samples
-description: Azure PowerShell Samples - Scripts to help you manage labs in Azure Lab Services
+description: Learn about Azure PowerShell scripts. These samples help you manage labs in Azure Lab Services.
Previously updated : 06/26/2020 Last updated : 02/02/2022 # Azure PowerShell samples for Azure Lab Services
-The following table includes links to sample Azure PowerShell scripts for Azure Lab Services.
+This article includes sample Azure PowerShell scripts for Azure Lab Services.
+++
+This article includes the following samples:
| Script | Description |
+|- | |
+| [Add an external user to a lab](#add-an-external-user-to-a-lab) | This PowerShell script adds an external user to a lab in Azure DevTest Labs. |
+| [Add marketplace images to a lab](#add-a-marketplace-image-to-a-lab) | This PowerShell script adds marketplace images to a lab in Azure DevTest Labs. |
+| [Create a custom image from a virtual hard drive (VHD)](#create-a-custom-image-from-a-vhd-file) | This PowerShell script creates a custom image in a lab in Azure DevTest Labs. |
+| [Create a custom role in a lab](#create-a-custom-role-in-a-lab) | This PowerShell script creates a custom role in a lab in Azure Lab Services. |
+| [Set allowed virtual machine sizes in a lab](#set-allowed-virtual-machine-sizes) | This PowerShell script sets allowed virtual machine sizes in a lab. |
+
+## Prerequisites
+
+All of these scripts have the following prerequisite:
+
+- An existing lab. If you don't have one, follow the quickstart [Create a lab in the Azure portal](devtest-lab-create-lab.md).
+
+## Add an external user to a lab
+
+This sample PowerShell script adds an external user to a lab in Azure DevTest Labs.
++
+This script uses the following commands:
+
+| Command | Notes |
+|||
+| [Get-AzADUser](/powershell/module/az.resources/get-azaduser) | Retrieves the user object from Azure Active Directory. |
+| [New-AzRoleAssignment](/powershell/module/az.resources/new-azroleassignment) | Assigns the specified role to the specified principal, at the specified scope. |
+
+## Add a marketplace image to a lab
+
+This sample PowerShell script adds a marketplace image to a lab in Azure DevTest Labs.
++
+This script uses the following commands:
+
+| Command | Notes |
+|||
+| [Get-AzResource](/powershell/module/az.resources/get-azresource) | Gets resources. |
+| [Set-AzResource](/powershell/module/az.resources/set-azresource) | Modifies a resource. |
+| [New-AzResource](/powershell/module/az.resources/new-azresource) | Creates a resource. |
+
+## Create a custom image from a VHD file
+
+This sample PowerShell script creates a custom image from a VHD file in Azure Lab Services.
++
+This script uses the following commands:
+
+| Command | Notes |
+|||
+| [Get-AzResource](/powershell/module/az.resources/get-azresource) | Gets resources. |
+| [Get-AzStorageAccountKey](/powershell/module/az.storage/get-azstorageaccountkey) | Gets the access keys for an Azure Storage account. |
+| [New-AzResourceGroupDeployment](/powershell/module/az.resources/new-azresourcegroupdeployment) | Adds an Azure deployment to a resource group. |
+
+## Create a custom role in a lab
+
+This sample PowerShell script creates a custom role to use in a lab in Azure DevTest Labs.
++
+This script uses the following commands:
+
+| Command | Notes |
|||
-|[Add an external user to a lab](scripts/add-external-user-to-lab.md)| This PowerShell script adds an external user to a lab in Azure DevTest Labs. |
-|[Add marketplace images to a lab](scripts/add-marketplace-images-to-lab.md)| This PowerShell script adds marketplace images to a lab in Azure DevTest Labs. |
-|[Create a custom image from a VHD](scripts/create-custom-image-from-vhd.md)| This PowerShell script creates a custom image in a lab in Azure DevTest Labs. |
-|[Create a custom role in a lab](scripts/create-custom-role-in-lab.md)| This PowerShell script creates a custom role in a lab in Azure Lab Services. |
-|[Set allowed VM sizes in a lab](scripts/set-allowed-vm-sizes-in-lab.md)| This PowerShell script sets allowed virtual machine (VM) sizes in a lab. |
+| [Get-AzProviderOperation](/powershell/module/az.resources/get-azprovideroperation) | Gets the operations for an Azure resource provider that are securable using Azure role-based access control. |
+| [Get-AzRoleDefinition](/powershell/module/az.resources/get-azroledefinition) | Lists all Azure roles that are available for assignment. |
+| [New-AzRoleDefinition](/powershell/module/az.resources/new-azroledefinition) | Creates a custom role. |
+
+## Set allowed virtual machine sizes
+
+This sample PowerShell script sets allowed virtual machine sizes in Azure Lab Services.
++
+| Command | Notes |
+|||
+| [Get-AzResource](/powershell/module/az.resources/get-azresource) | Gets resources. |
+| [Set-AzResource](/powershell/module/az.resources/set-azresource) | Modifies a resource. |