Updates from: 05/19/2021 03:06:38
Service Microsoft Docs article Related commit history on GitHub Change details
active-directory-b2c Deploy Custom Policies Devops https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/deploy-custom-policies-devops.md
Previously updated : 02/14/2020 Last updated : 05/18/2021
There are three primary steps required for enabling Azure Pipelines to manage cu
1. Create a web application registration in your Azure AD B2C tenant
1. Configure an Azure Repo
-1. Configure an Azure Pipeline
+1. Configure Azure Pipelines
> [!IMPORTANT]
-> Managing Azure AD B2C custom policies with an Azure Pipeline currently uses **preview** operations available on the Microsoft Graph API `/beta` endpoint. Use of these APIs in production applications is not supported. For more information, see the [Microsoft Graph REST API beta endpoint reference](/graph/api/overview?toc=.%2fref%2ftoc.json&view=graph-rest-beta&preserve-view=true).
+> Managing Azure AD B2C custom policies with Azure Pipelines currently uses **preview** operations available on the Microsoft Graph API `/beta` endpoint. Use of these APIs in production applications is not supported. For more information, see the [Microsoft Graph REST API beta endpoint reference](/graph/api/overview?toc=.%2fref%2ftoc.json&view=graph-rest-beta&preserve-view=true).
## Prerequisites

* [Azure AD B2C tenant](tutorial-create-tenant.md), and credentials for a user in the directory with the [B2C IEF Policy Administrator](../active-directory/roles/permissions-reference.md#b2c-ief-policy-administrator) role
* [Custom policies](tutorial-create-user-flows.md?pivots=b2c-custom-policy) uploaded to your tenant
* [Management app](microsoft-graph-get-started.md) registered in your tenant with the Microsoft Graph API permission *Policy.ReadWrite.TrustFramework*
-* [Azure Pipeline](https://azure.microsoft.com/services/devops/pipelines/), and access to an [Azure DevOps Services project][devops-create-project]
+* [Azure Pipelines](https://azure.microsoft.com/services/devops/pipelines/), and access to an [Azure DevOps Services project][devops-create-project]
## Client credentials grant flow
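In the client credentials grant, the pipeline authenticates as the registered management application itself, with no user sign-in. As a rough sketch (the tenant name, client ID, and secret below are placeholders, not values from this article), the token request to the Microsoft identity platform looks like this:

```python
# Sketch of the OAuth 2.0 client credentials token request used to call
# Microsoft Graph. All identifiers here are placeholder values.
def build_token_request(tenant: str, client_id: str, client_secret: str):
    """Return the token endpoint URL and form body for a client credentials grant."""
    url = f"https://login.microsoftonline.com/{tenant}/oauth2/v2.0/token"
    body = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # ".default" requests the app's pre-consented Graph permissions,
        # such as Policy.ReadWrite.TrustFramework.
        "scope": "https://graph.microsoft.com/.default",
    }
    return url, body

url, body = build_token_request(
    "contoso.onmicrosoft.com",               # placeholder tenant
    "00000000-0000-0000-0000-000000000000",  # placeholder client ID
    "<client-secret>",                       # placeholder secret
)
```

POSTing this body (form-encoded) to the URL returns JSON containing an `access_token`, which is then sent in an `Authorization: Bearer` header on Graph requests.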
With a management application registered, you're ready to configure a repository
1. In your project, navigate to **Repos** and select the **Files** page. Select an existing repository or create one for this exercise.
1. Create a folder named *B2CAssets*. Name the required placeholder file *README.md* and **Commit** the file. You can remove this file later, if you like.
1. Add your Azure AD B2C policy files to the *B2CAssets* folder. This includes *TrustFrameworkBase.xml*, *TrustFrameworkExtensions.xml*, *SignUpOrSignin.xml*, *ProfileEdit.xml*, *PasswordReset.xml*, and any other policies you've created. Record the filename of each Azure AD B2C policy file for use in a later step (they're used as PowerShell script arguments).
-1. Create a folder named *Scripts* in the root directory of the repository, name the placeholder file *DeployToB2c.ps1*. Don't commit the file at this point, you'll do so in a later step.
-1. Paste the following PowerShell script into *DeployToB2c.ps1*, then **Commit** the file. The script acquires a token from Azure AD and calls the Microsoft Graph API to upload the policies within the *B2CAssets* folder to your Azure AD B2C tenant.
+1. Create a folder named *Scripts* in the root directory of the repository, name the placeholder file *DeployToB2C.ps1*. Don't commit the file at this point, you'll do so in a later step.
+1. Paste the following PowerShell script into *DeployToB2C.ps1*, then **Commit** the file. The script acquires a token from Azure AD and calls the Microsoft Graph API to upload the policies within the *B2CAssets* folder to your Azure AD B2C tenant.
```PowerShell
[Cmdletbinding()]
With a management application registered, you're ready to configure a repository
exit 0
```
-## Configure your Azure pipeline
+## Configure Azure Pipelines
With your repository initialized and populated with your custom policy files, you're ready to set up the release pipeline.
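The release pipeline runs the *DeployToB2C.ps1* script against each policy file. Purely as an illustrative sketch (the trigger, variable names, and script parameter names below are assumptions, not taken from this article), a pipeline step for one policy might look like:

```yaml
# Illustrative only -- task inputs, variables, and script parameters
# are assumed, not taken from this article's script.
trigger:
  branches:
    include:
      - master
pool:
  vmImage: windows-latest
steps:
  - task: PowerShell@2
    displayName: Upload TrustFrameworkBase policy
    inputs:
      filePath: $(Build.SourcesDirectory)/Scripts/DeployToB2C.ps1
      arguments: >
        -ClientID $(clientId)
        -ClientSecret $(clientSecret)
        -TenantId $(tenantId)
        -PolicyId B2C_1A_TrustFrameworkBase
        -PathToFile $(Build.SourcesDirectory)/B2CAssets/TrustFrameworkBase.xml
```

In practice, the client secret would be stored as a secret pipeline variable, and one such step would exist per policy file recorded earlier.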
active-directory-b2c Localization String Ids https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/localization-string-ids.md
The following example shows the use of some of the user interface elements in th
The following are the IDs for a content definition with an ID of `api.phonefactor`, and the [phone factor technical profile](phone-factor-technical-profile.md).
-| ID | Default value |
-| -- | - |
-| **button_verify** | Call Me |
-| **country_code_label** | Country Code |
-| **cancel_message** | The user has canceled multi-factor authentication |
-| **text_button_send_second_code** | send a new code |
-| **code_pattern** | \\d{6} |
-| **intro_mixed** | We have the following number on record for you. We can send a code via SMS or phone to authenticate you. |
-| **intro_mixed_p** | We have the following numbers on record for you. Choose a number that we can phone or send a code via SMS to authenticate you. |
-| **button_verify_code** | Verify Code |
-| **requiredField_code** | Please enter the verification code you received |
-| **invalid_code** | Please enter the 6-digit code you received |
-| **button_cancel** | Cancel |
-| **local_number_input_placeholder_text** | Phone number |
-| **button_retry** | Retry |
-| **alternative_text** | I don't have my phone |
-| **intro_phone_p** | We have the following numbers on record for you. Choose a number that we can phone to authenticate you. |
-| **intro_phone** | We have the following number on record for you. We will phone to authenticate you. |
-| **enter_code_text_intro** | Enter your verification code below, or |
-| **intro_entry_phone** | Enter a number below that we can phone to authenticate you. |
-| **intro_entry_sms** | Enter a number below that we can send a code via SMS to authenticate you. |
-| **button_send_code** | Send Code |
-| **invalid_number** | Please enter a valid phone number |
-| **intro_sms** | We have the following number on record for you. We will send a code via SMS to authenticate you. |
-| **intro_entry_mixed** | Enter a number below that we can send a code via SMS or phone to authenticate you. |
-| **number_pattern** | ^\\+(?:[0-9][\\x20-]?){6,14}[0-9]$ |
-| **intro_sms_p** |We have the following numbers on record for you. Choose a number that we can send a code via SMS to authenticate you. |
-| **requiredField_countryCode** | Please select your country code |
-| **requiredField_number** | Please enter your phone number |
-| **country_code_input_placeholder_text** |Country or region |
-| **number_label** | Phone Number |
-| **error_tryagain** | The phone number you provided is busy or unavailable. Please check the number and try again. |
-| **error_sms_throttled** | You hit the limit on the number of text messages. Try again shortly. |
-| **error_phone_throttled** | You hit the limit on the number of call attempts. Try again shortly. |
-| **error_throttled** | You hit the limit on the number of verification attempts. Try again shortly. |
-| **error_incorrect_code** | The verification code you have entered does not match our records. Please try again, or request a new code. |
-| **countryList** | See [the countries list](#phone-factor-authentication-page-example). |
-| **error_448** | The phone number you provided is unreachable. |
-| **error_449** | User has exceeded the number of retry attempts. |
-| **verification_code_input_placeholder_text** | Verification code |
+| ID | Default value | Page Layout Version |
+| -- | - | -- |
+| **button_verify** | Call Me | `All` |
+| **country_code_label** | Country Code | `All` |
+| **cancel_message** | The user has canceled multi-factor authentication | `All` |
+| **text_button_send_second_code** | send a new code | `All` |
+| **code_pattern** | \\d{6} | `All` |
+| **intro_mixed** | We have the following number on record for you. We can send a code via SMS or phone to authenticate you. | `All` |
+| **intro_mixed_p** | We have the following numbers on record for you. Choose a number that we can phone or send a code via SMS to authenticate you. | `All` |
+| **button_verify_code** | Verify Code | `All` |
+| **requiredField_code** | Please enter the verification code you received | `All` |
+| **invalid_code** | Please enter the 6-digit code you received | `All` |
+| **button_cancel** | Cancel | `All` |
+| **local_number_input_placeholder_text** | Phone number | `All` |
+| **button_retry** | Retry | `All` |
+| **alternative_text** | I don't have my phone | `All` |
+| **intro_phone_p** | We have the following numbers on record for you. Choose a number that we can phone to authenticate you. | `All` |
+| **intro_phone** | We have the following number on record for you. We will phone to authenticate you. | `All` |
+| **enter_code_text_intro** | Enter your verification code below, or | `All` |
+| **intro_entry_phone** | Enter a number below that we can phone to authenticate you. | `All` |
+| **intro_entry_sms** | Enter a number below that we can send a code via SMS to authenticate you. | `All` |
+| **button_send_code** | Send Code | `All` |
+| **invalid_number** | Please enter a valid phone number | `All` |
+| **intro_sms** | We have the following number on record for you. We will send a code via SMS to authenticate you. | `All` |
+| **intro_entry_mixed** | Enter a number below that we can send a code via SMS or phone to authenticate you. | `All` |
+| **number_pattern** | ^\\+(?:[0-9][\\x20-]?){6,14}[0-9]$ | `All` |
+| **intro_sms_p** |We have the following numbers on record for you. Choose a number that we can send a code via SMS to authenticate you. | `All` |
+| **requiredField_countryCode** | Please select your country code | `All` |
+| **requiredField_number** | Please enter your phone number | `All` |
+| **country_code_input_placeholder_text** |Country or region | `All` |
+| **number_label** | Phone Number | `All` |
+| **error_tryagain** | The phone number you provided is busy or unavailable. Please check the number and try again. | `All` |
+| **error_sms_throttled** | You hit the limit on the number of text messages. Try again shortly. | `>= 1.2.3` |
+| **error_phone_throttled** | You hit the limit on the number of call attempts. Try again shortly. | `>= 1.2.3` |
+| **error_throttled** | You hit the limit on the number of verification attempts. Try again shortly. | `>= 1.2.3` |
+| **error_incorrect_code** | The verification code you have entered does not match our records. Please try again, or request a new code. | `All` |
+| **countryList** | See [the countries list](#phone-factor-authentication-page-example). | `All` |
+| **error_448** | The phone number you provided is unreachable. | `All` |
+| **error_449** | User has exceeded the number of retry attempts. | `All` |
+| **verification_code_input_placeholder_text** | Verification code | `All` |
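Both `code_pattern` and `number_pattern` above are regular expressions (with backslashes doubled for Markdown). A quick sketch in Python of what they accept:

```python
import re

# Patterns from the table above, with Markdown's doubled backslashes undone.
code_pattern = r"\d{6}"                               # verification code
number_pattern = r"^\+(?:[0-9][\x20-]?){6,14}[0-9]$"  # phone number

# A verification code is exactly six digits.
assert re.fullmatch(code_pattern, "123456")
assert not re.fullmatch(code_pattern, "12345")

# A phone number is a leading "+", then 7-15 digits that may be
# separated by spaces (\x20) or hyphens, ending in a digit.
assert re.match(number_pattern, "+1 425 555 0100")
assert not re.match(number_pattern, "425 555 0100")  # no leading "+"
```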
The following example shows the use of some of the user interface elements in the MFA enrollment page:
active-directory-b2c Tenant Management https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/tenant-management.md
Previously updated : 05/03/2021 Last updated : 05/18/2021
Azure AD B2C relies on the Azure AD platform. The following Azure AD features can b
| [Groups](../active-directory/fundamentals/active-directory-groups-create-azure-portal.md) | Groups can be used to manage administrative and user accounts.| Groups can be used to manage administrative accounts. [Consumer accounts](user-overview.md#consumer-user) don't support groups. |
| [Inviting External Identities guests](../active-directory/external-identities/add-users-administrator.md)| You can invite guest users and configure External Identities features such as federation and sign-in with Facebook and Google accounts. | You can invite only a Microsoft account or an Azure AD user as a guest to your Azure AD tenant for accessing applications or managing tenants. For [consumer accounts](user-overview.md#consumer-user), you use Azure AD B2C user flows and custom policies to manage users and sign-up or sign-in with external identity providers, such as Google or Facebook. |
| [Roles and administrators](../active-directory/fundamentals/active-directory-users-assign-role-azure-portal.md)| Fully supported for administrative and user accounts. | Roles are not supported with [consumer accounts](user-overview.md#consumer-user). Consumer accounts don't have access to any Azure resources.|
-| [Custom domain names](../active-directory/roles/permissions-reference.md#) | You can use Azure AD custom domains for administrative accounts only. | [Consumer accounts](user-overview.md#consumer-user) can sign in with a username, phone number, or any email address. You can use [custom domains](custom-domain.md) in your redirect URLs.|
-| [Conditional Access](../active-directory/roles/permissions-reference.md#) | Fully supported for administrative and user accounts. | A subset of Azure AD Conditional Access features is supported with [consumer accounts](user-overview.md#consumer-user) Lean how to configure Azure AD B2C [custom domain](conditional-access-user-flow.md).|
+| [Custom domain names](../active-directory/fundamentals/add-custom-domain.md) | You can use Azure AD custom domains for administrative accounts only. | [Consumer accounts](user-overview.md#consumer-user) can sign in with a username, phone number, or any email address. You can use [custom domains](custom-domain.md) in your redirect URLs.|
+| [Conditional Access](../active-directory/conditional-access/overview.md) | Fully supported for administrative and user accounts. | A subset of Azure AD Conditional Access features is supported with [consumer accounts](user-overview.md#consumer-user). Learn how to configure Azure AD B2C [conditional access](conditional-access-user-flow.md).|
## Other Azure resources in your tenant
active-directory Concept Condition Filters For Devices https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/conditional-access/concept-condition-filters-for-devices.md
+
+ Title: Filters for devices as a condition in Conditional Access policy - Azure Active Directory
+description: Use device filters in Conditional Access to enhance security posture
+ Last updated : 05/18/2021
+# Conditional Access: Filters for devices (preview)
+
+When creating Conditional Access policies, administrators have asked for the ability to target or exclude specific devices in their environment. The preview condition filters for devices give administrators this capability. Now you can target specific devices using [supported operators and device properties for filters](#supported-operators-and-device-properties-for-filters) and the other available assignment conditions in your Conditional Access policies.
++
+> [!IMPORTANT]
+> Filters for devices is currently in public preview. This preview version is provided without a service level agreement, and it's not recommended for production workloads. Certain features might not be supported or might have constrained capabilities.
+> For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
+
+## Common scenarios
+
+There are multiple scenarios that organizations can now enable using the filters for devices condition. Below are some core scenarios with examples of how to use this new condition.
+
+- Restrict access to privileged resources like Microsoft Azure Management, to privileged users, accessing from [privileged or secure admin workstations](/security/compass/privileged-access-devices). For this scenario, organizations would create two Conditional Access policies:
+ - Policy 1: All users with the directory role of Global administrator, accessing the Microsoft Azure Management cloud app, and for Access controls, Grant access, but require multi-factor authentication and require device to be marked as compliant.
+ - Policy 2: All users with the directory role of Global administrator, accessing the Microsoft Azure Management cloud app, excluding filters for devices using rule expression device.extensionAttribute1 equals SAW and for Access controls, Block.
+- Block access to organization resources from devices running an unsupported Operating System version like Windows 7. For this scenario, organizations would create the following two Conditional Access policies:
+ - Policy 1: All users, accessing all cloud apps and for Access controls, Grant access, but require device to be marked as compliant or require device to be hybrid Azure AD joined.
+ - Policy 2: All users, accessing all cloud apps, including filters for devices using rule expression device.operatingSystem equals Windows and device.operatingSystemVersion startsWith "6.1" and for Access controls, Block.
+- Do not require multi-factor authentication for specific accounts like service accounts when used on specific devices like Teams phones or Surface Hub devices. For this scenario, organizations would create the following two Conditional Access policies:
+ - Policy 1: All users excluding service accounts, accessing all cloud apps, and for Access controls, Grant access, but require multi-factor authentication.
+ - Policy 2: Select users and groups and include group that contains service accounts only, accessing all cloud apps, excluding filters for devices using rule expression device.extensionAttribute2 not equals TeamsPhoneDevice and for Access controls, Block.
+
+## Create a Conditional Access policy
+
+Filters for devices are an option when creating a Conditional Access policy in the Azure portal or using the Microsoft Graph API.
+
+> [!IMPORTANT]
> Device state and filters for devices cannot be used together in a Conditional Access policy. Filters for devices provides more granular targeting, including support for targeting device state information through the `trustType` and `isCompliant` properties.
+
+The following steps will help create two Conditional Access policies to support the first scenario under [Common scenarios](#common-scenarios).
+
+Policy 1: All users with the directory role of Global administrator, accessing the Microsoft Azure Management cloud app, and for Access controls, Grant access, but require multi-factor authentication and require device to be marked as compliant.
+
+1. Sign in to the **Azure portal** as a global administrator, security administrator, or Conditional Access administrator.
+1. Browse to **Azure Active Directory** > **Security** > **Conditional Access**.
+1. Select **New policy**.
+1. Give your policy a name. We recommend that organizations create a meaningful standard for the names of their policies.
+1. Under **Assignments**, select **Users and groups**.
+ 1. Under **Include**, select **Directory roles** and choose **Global administrator**.
+
+ > [!WARNING]
+ > Conditional Access policies support built-in roles. Conditional Access policies are not enforced for other role types including [administrative unit-scoped](../roles/admin-units-assign-roles.md) or [custom roles](../roles/custom-create.md).
+
+ 1. Under **Exclude**, select **Users and groups** and choose your organization's emergency access or break-glass accounts.
+ 1. Select **Done**.
+1. Under **Cloud apps or actions** > **Include**, select **Select apps**, and select **Microsoft Azure Management**.
+1. Under **Access controls** > **Grant**, select **Grant access**, **Require multi-factor authentication**, and **Require device to be marked as compliant**, then select **Select**.
+1. Confirm your settings and set **Enable policy** to **On**.
+1. Select **Create** to create and enable your policy.
+
+Policy 2: All users with the directory role of Global administrator, accessing the Microsoft Azure Management cloud app, excluding filters for devices using rule expression device.extensionAttribute1 equals SAW and for Access controls, Block.
+
+1. Select **New policy**.
+1. Give your policy a name. We recommend that organizations create a meaningful standard for the names of their policies.
+1. Under **Assignments**, select **Users and groups**.
+ 1. Under **Include**, select **Directory roles** and choose **Global administrator**.
+
+ > [!WARNING]
+ > Conditional Access policies support built-in roles. Conditional Access policies are not enforced for other role types including [administrative unit-scoped](../roles/admin-units-assign-roles.md) or [custom roles](../roles/custom-create.md).
+
+ 1. Under **Exclude**, select **Users and groups** and choose your organization's emergency access or break-glass accounts.
+ 1. Select **Done**.
+1. Under **Cloud apps or actions** > **Include**, select **Select apps**, and select **Microsoft Azure Management**.
+1. Under **Conditions**, select **Filters for devices (Preview)**.
+ 1. Toggle **Configure** to **Yes**.
+ 1. Set **Devices matching the rule** to **Exclude filtered devices from policy**.
+ 1. Set the property to `ExtensionAttribute1`, the operator to `Equals` and the value to `SAW`.
+ 1. Select **Done**.
+1. Under **Access controls** > **Grant**, select **Block access**, then select **Select**.
+1. Confirm your settings and set **Enable policy** to **On**.
+1. Select **Create** to create and enable your policy.
+
+### Filters for devices Graph API
+
+The filters for devices API is currently available on the Microsoft Graph beta endpoint and can be accessed at https://graph.microsoft.com/beta/identity/conditionalaccess/policies/. You can configure filters for devices when creating a new Conditional Access policy, or you can update an existing policy to configure the filters for devices condition. To update an existing policy, make a PATCH call to the Microsoft Graph beta endpoint mentioned above, appending the policy ID of an existing policy and sending the following request body. The example here shows configuring a filters for devices condition that excludes devices not marked as SAW devices. The rule syntax can consist of more than one single expression. To learn more about the syntax, see rules with multiple expressions.
+
+```json
+{
+ "conditions": {
+ "devices": {
+ "deviceFilter": {
+ "mode": "exclude",
+ "rule": "device.extensionAttribute1 -ne \"SAW\""
+ }
+ }
+ }
+}
+```
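As a sketch of that PATCH call in Python (the policy ID is a placeholder, the helper name is ours rather than a Graph SDK method, and actually sending the request requires a bearer token from the client credentials flow):

```python
import json

# Microsoft Graph beta endpoint for Conditional Access policies.
GRAPH_CA_POLICIES = "https://graph.microsoft.com/beta/identity/conditionalaccess/policies"

def build_device_filter_patch(policy_id: str, mode: str, rule: str):
    """Build the URL and JSON body for a PATCH that sets a device filter
    on an existing Conditional Access policy. Illustrative helper only."""
    url = f"{GRAPH_CA_POLICIES}/{policy_id}"
    body = {"conditions": {"devices": {"deviceFilter": {"mode": mode, "rule": rule}}}}
    return url, json.dumps(body)

url, body = build_device_filter_patch(
    "11111111-1111-1111-1111-111111111111",  # placeholder policy ID
    "exclude",
    'device.extensionAttribute1 -ne "SAW"',
)
```

The body would be sent with a `Content-Type: application/json` and an `Authorization: Bearer <token>` header.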
+
+## Supported operators and device properties for filters
+
+The following device attributes can be used with filters for devices condition in Conditional Access.
+
+| Supported device attributes | Supported operators | Supported values | Example |
+| | | | |
+| deviceId | Equals, NotEquals, In, NotIn | A valid deviceId that is a GUID | (device.deviceid -eq "498c4de7-1aee-4ded-8d5d-000000000000") |
+| displayName | Equals, NotEquals, StartsWith, NotStartsWith, EndsWith, NotEndsWith, Contains, NotContains, In, NotIn | Any string | (device.displayName -contains "ABC") |
+| manufacturer | Equals, NotEquals, StartsWith, NotStartsWith, EndsWith, NotEndsWith, Contains, NotContains, In, NotIn | Any string | (device.manufacturer -startsWith "Microsoft") |
+| mdmAppId | Equals, NotEquals, In, NotIn | A valid MDM application ID | (device.mdmAppId -in ["0000000a-0000-0000-c000-000000000000"]) |
+| model | Equals, NotEquals, StartsWith, NotStartsWith, EndsWith, NotEndsWith, Contains, NotContains, In, NotIn | Any string | (device.model -notContains "Surface") |
+| operatingSystem | Equals, NotEquals, StartsWith, NotStartsWith, EndsWith, NotEndsWith, Contains, NotContains, In, NotIn | A valid operating system (like Windows, iOS, or Android) | (device.operatingSystem -eq "Windows") |
+| operatingSystemVersion | Equals, NotEquals, StartsWith, NotStartsWith, EndsWith, NotEndsWith, Contains, NotContains, In, NotIn | A valid operating system version (like 6.1 for Windows 7, 6.2 for Windows 8, or 10.0 for Windows 10) | (device.operatingSystemVersion -in ["10.0.18363", "10.0.19041", "10.0.19042"]) |
+| physicalIds | Contains, NotContains | As an example, all Windows Autopilot devices store ZTDId (a unique value assigned to all imported Windows Autopilot devices) in the device physicalIds property. | (device.devicePhysicalIDs -contains "[ZTDId]") |
+| profileType | Equals, NotEquals | A valid profile type set for a device. Supported values are: RegisteredDevice (default), SecureVM (used for Windows VMs in Azure enabled with Azure AD sign-in), Printer (used for printers), Shared (used for shared devices), IoT (used for IoT devices) | (device.profileType -notIn ["Printer", "Shared", "IoT"]) |
+| systemLabels | Contains, NotContains | List of labels applied to the device by the system. Some of the supported values are: AzureResource (used for Windows VMs in Azure enabled with Azure AD sign-in), M365Managed (used for devices managed using Microsoft Managed Desktop), MultiUser (used for shared devices) | (device.systemLabels -contains "M365Managed") |
+| trustType | Equals, NotEquals | A valid registered state for devices. Supported values are: AzureAD (used for Azure AD joined devices), ServerAD (used for Hybrid Azure AD joined devices), Workplace (used for Azure AD registered devices) | (device.trustType -notIn 'ServerAD, Workplace') |
+| extensionAttribute1-15 | Equals, NotEquals, StartsWith, NotStartsWith, EndsWith, NotEndsWith, Contains, NotContains, In, NotIn | extensionAttributes 1-15 are attributes that customers can use for device objects. Customers can update any of extensionAttributes 1 through 15 with custom values and use them in the filters for devices condition in Conditional Access. Any string value can be used. | (device.extensionAttribute1 -eq 'SAW') |
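To make the operator semantics above concrete, here is a toy evaluator of a single filter clause (our illustration only; the actual service parses a full rule grammar with `and`/`or` and parentheses):

```python
def eval_clause(device: dict, prop: str, op: str, value) -> bool:
    """Toy evaluation of one filters-for-devices clause against a device's
    attributes. Illustrative only -- not the service's implementation."""
    actual = device.get(prop)
    if op == "-eq":
        return actual == value
    if op == "-ne":
        return actual != value
    if op == "-startsWith":
        return isinstance(actual, str) and actual.startswith(value)
    if op == "-contains":
        # Membership for list-valued properties (e.g. systemLabels),
        # substring for string-valued ones.
        return actual is not None and value in actual
    if op == "-in":
        return actual in value
    raise ValueError(f"unsupported operator: {op}")

device = {"operatingSystem": "Windows", "extensionAttribute1": "SAW",
          "systemLabels": ["M365Managed"]}
assert eval_clause(device, "operatingSystem", "-eq", "Windows")
assert eval_clause(device, "systemLabels", "-contains", "M365Managed")
assert not eval_clause(device, "extensionAttribute1", "-ne", "SAW")
```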
+
+## Policy behavior with filters for devices
+
+The filters for devices (preview) condition in Conditional Access evaluates policy based on the device attributes of a registered device in Azure AD, so it's important to understand under what circumstances the policy is or isn't applied. The table below illustrates the behavior when the filters for devices condition is configured.
+
| Filters for devices condition | Device registration state | Device filter applied |
| --- | --- | --- |
+| Include/exclude mode with positive operators (Equals, StartsWith, EndsWith, Contains, In) and use of any attributes | Unregistered device | No |
+| Include/exclude mode with positive operators (Equals, StartsWith, EndsWith, Contains, In) and use of attributes excluding extensionAttributes1-15 | Registered device | Yes, if criteria are met |
+| Include/exclude mode with positive operators (Equals, StartsWith, EndsWith, Contains, In) and use of attributes including extensionAttributes1-15 | Registered device managed by Intune | Yes, if criteria are met |
+| Include/exclude mode with positive operators (Equals, StartsWith, EndsWith, Contains, In) and use of attributes including extensionAttributes1-15 | Registered device not managed by Intune | Yes, if criteria are met and if device is compliant or Hybrid Azure AD joined |
+| Include/exclude mode with negative operators (NotEquals, NotStartsWith, NotEndsWith, NotContains, NotIn) and use of any attributes | Unregistered device | Yes |
+| Include/exclude mode with negative operators (NotEquals, NotStartsWith, NotEndsWith, NotContains, NotIn) and use of any attributes excluding extensionAttributes1-15 | Registered device | Yes, if criteria are met |
+| Include/exclude mode with negative operators (NotEquals, NotStartsWith, NotEndsWith, NotContains, NotIn) and use of any attributes including extensionAttributes1-15 | Registered device managed by Intune | Yes, if criteria are met |
+| Include/exclude mode with negative operators (NotEquals, NotStartsWith, NotEndsWith, NotContains, NotIn) and use of any attributes including extensionAttributes1-15 | Registered device not managed by Intune | Yes, if criteria are met and if device is compliant or Hybrid Azure AD joined |
+
+## Next steps
+
+- [Conditional Access: Conditions](concept-conditional-access-conditions.md)
+- [Common Conditional Access policies](concept-conditional-access-policy-common.md)
+- [Securing devices as part of the privileged access story](/security/compass/privileged-access-devices)
active-directory Concept Conditional Access Cloud Apps https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/conditional-access/concept-conditional-access-cloud-apps.md
Title: Cloud apps or actions in Conditional Access policy - Azure Active Directory
-description: What are cloud apps or actions in an Azure AD Conditional Access policy
+ Title: Cloud apps, actions, and authentication context in Conditional Access policy - Azure Active Directory
+description: What are cloud apps, actions, and authentication context in an Azure AD Conditional Access policy
Previously updated : 10/16/2020 Last updated : 05/13/2021
-# Conditional Access: Cloud apps or actions
+# Conditional Access: Cloud apps, actions, and authentication context
-Cloud apps or actions are a key signal in a Conditional Access policy. Conditional Access policies allow administrators to assign controls to specific applications or actions.
+Cloud apps, actions, and authentication context are key signals in a Conditional Access policy. Conditional Access policies allow administrators to assign controls to specific applications, actions, or authentication context.
- Administrators can choose from the list of applications that include built-in Microsoft applications and any [Azure AD integrated applications](../manage-apps/what-is-application-management.md) including gallery, non-gallery, and applications published through [Application Proxy](../manage-apps/what-is-application-proxy.md).
-- Administrators may choose to define policy not based on a cloud application but on a user action. We support two user actions
- - Register security information (preview) to enforce controls around the [combined security information registration experience](../authentication/howto-registration-mfa-sspr-combined.md)
- - Register or join devices (preview) to enforce controls when users [register](../devices/concept-azure-ad-register.md) or [join](../devices/concept-azure-ad-join.md) devices to Azure AD.
+- Administrators may choose to define policy not based on a cloud application but on a [user action](#user-actions) like **Register security information** or **Register or join devices (Preview)**, allowing Conditional Access to enforce controls around those actions.
+- Administrators can use [authentication context](#authentication-context-preview) to provide an extra layer of security inside of applications.
![Define a Conditional Access policy and specify cloud apps](./media/concept-conditional-access-cloud-apps/conditional-access-cloud-apps-or-actions.png)
Administrators can assign a Conditional Access policy to the following cloud app
- Virtual Private Network (VPN)
- Windows Defender ATP
-Applications that are available to Conditional Access have gone through an onboarding and validation process. This does not include all Microsoft apps, as many are backend services and not meant to have policy directly applied to them. If you are looking for an application that is missing, you can contact the specific application team or make a request on [UserVoice](https://feedback.azure.com/forums/169401-azure-active-directory?category_id=167259).
+Applications that are available to Conditional Access have gone through an onboarding and validation process. This list does not include all Microsoft apps, as many are backend services and not meant to have policy directly applied to them. If you are looking for an application that is missing, you can contact the specific application team or make a request on [UserVoice](https://feedback.azure.com/forums/169401-azure-active-directory?category_id=167259).
### Office 365
The Microsoft Azure Management application includes multiple underlying services
> [!NOTE]
> The Microsoft Azure Management application applies to Azure PowerShell, which calls the Azure Resource Manager API. It does not apply to Azure AD PowerShell, which calls Microsoft Graph.
-## Other applications
+### Other applications
In addition to the Microsoft apps, administrators can add any Azure AD registered application to Conditional Access policies. These applications may include:
User actions are tasks that can be performed by a user. Currently, Conditional A
- `Require multi-factor authentication` is the only access control available with this user action and all others are disabled. This restriction prevents conflicts with access controls that are either dependent on Azure AD device registration or not applicable to Azure AD device registration. - `Client apps` and `Device state` conditions are not available with this user action since they are dependent on Azure AD device registration to enforce Conditional Access policies. - When a Conditional Access policy is enabled with this user action, you must set **Azure Active Directory** > **Devices** > **Device Settings** - `Devices to be Azure AD joined or Azure AD registered require Multi-Factor Authentication` to **No**. Otherwise, the Conditional Access policy with this user action is not properly enforced. More information regarding this device setting can be found in [Configure device settings](../devices/device-management-azure-portal.md#configure-device-settings).
-
+
+## Authentication context (Preview)
+
+Authentication context can be used to further secure data and actions in applications. These applications can be your own custom applications, custom line of business (LOB) applications, applications like SharePoint, or applications protected by Microsoft Cloud App Security (MCAS).
+
+For example, an organization may keep different files in SharePoint like the lunch menu or their secret BBQ sauce recipe. Everyone may have access to the lunch menu, but users who have access to the secret BBQ sauce recipe may need to access it from a managed device and agree to specific terms of use.
+
+### Configure authentication contexts
+
+Authentication contexts are managed in the Azure portal under **Azure Active Directory** > **Security** > **Conditional Access** > **Authentication context**.
+
+![Manage authentication context in the Azure portal](./media/concept-conditional-access-cloud-apps/conditional-access-authentication-context-get-started.png)
+
+> [!WARNING]
+> * Deleting authentication context definitions is not possible during the preview.
+> * The preview is limited to a total of 25 authentication context definitions in the Azure portal.
+
+Create new authentication context definitions by selecting **New authentication context** in the Azure portal. Configure the following attributes:
+
+- **Display name** is the name that is used to identify the authentication context in Azure AD and across applications that consume authentication contexts. We recommend names that can be used across resources, like "trusted devices", to reduce the number of authentication contexts needed. Having a reduced set limits the number of redirects and provides a better end-user experience.
+- **Description** provides more information about the authentication context. It is used by Azure AD administrators and those applying authentication contexts to resources.
+- **Publish to apps**, when checked, advertises the authentication context to apps and makes it available to be assigned. If not checked, the authentication context will be unavailable to downstream resources.
+- **ID** is read-only and used in tokens and apps for request-specific authentication context definitions. It is listed here for troubleshooting and development use cases.
+
+Administrators can then select published authentication contexts in their Conditional Access policies under **Assignments** > **Cloud apps or actions** > **Authentication context**.
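These attributes correspond to the Microsoft Graph beta `authenticationContextClassReference` resource. The following Python sketch (hypothetical helper name; the payload shape assumes the documented beta properties `id`, `displayName`, `description`, and `isAvailable`) shows how such a definition might be composed before being sent to the `/beta/identity/conditionalAccess/authenticationContextClassReferences` endpoint:

```python
# Sketch: compose the request body for an authentication context definition.
# Property names assume the Microsoft Graph beta authenticationContextClassReference
# resource; "isAvailable" maps to the "Publish to apps" checkbox in the portal.
def build_auth_context(acr_id, display_name, description, publish):
    return {
        "id": acr_id,                 # read-only after creation; drawn from the fixed range (e.g. "c1")
        "displayName": display_name,  # keep names reusable across resources
        "description": description,
        "isAvailable": publish,       # False hides the context from downstream resources
    }

body = build_auth_context("c1", "trusted devices",
                          "Requires access from a managed device", True)
```

This only builds the JSON body; sending it requires an authorized Graph client with *Policy.ReadWrite.ConditionalAccess* or equivalent permissions.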
+
+### Tag resources with authentication contexts
+
+For more information about authentication context use in applications, see the following articles.
+
+- [SharePoint Online](/microsoft-365/compliance/sensitivity-labels-teams-groups-sites?view=o365-worldwide#more-information-about-the-dependencies-for-the-authentication-context-option)
+- [Microsoft Cloud App Security](/cloud-app-security/session-policy-aad?branch=pr-en-us-2082#require-step-up-authentication-authentication-context)
+- Custom applications
+ ## Next steps - [Conditional Access: Conditions](concept-conditional-access-conditions.md)- - [Conditional Access common policies](concept-conditional-access-policy-common.md) - [Client application dependencies](service-dependencies.md)
active-directory Concept Conditional Access Conditions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/conditional-access/concept-conditional-access-conditions.md
Previously updated : 03/17/2021 Last updated : 05/18/2021
The **Configure** toggle when set to **Yes** applies to checked items, when set
- This option includes applications like the Office desktop and phone applications. - Legacy authentication clients - Exchange ActiveSync clients
- - This includes all use of the Exchange ActiveSync (EAS) protocol.
+ - This selection includes all use of the Exchange ActiveSync (EAS) protocol.
- When policy blocks the use of Exchange ActiveSync, the affected user will receive a single quarantine email. This email will provide information on why they are blocked and include remediation instructions when applicable.
- - Administrators can apply policy only to supported platforms (such as iOS, Android, and Windows) through the Conditional Access MS Graph API.
+ - Administrators can apply policy only to supported platforms (such as iOS, Android, and Windows) through the Conditional Access Microsoft Graph API.
- Other clients - This option includes clients that use basic/legacy authentication protocols that do not support modern authentication. - Authenticated SMTP - Used by POP and IMAP clients to send email messages.
This setting has an impact on access attempts made from the following mobile app
| Dynamics CRM app | Dynamics CRM | Windows 10, Windows 8.1, iOS, and Android | | Mail/Calendar/People app, Outlook 2016, Outlook 2013 (with modern authentication)| Exchange Online | Windows 10 | | MFA and location policy for apps. Device-based policies are not supported.| Any My Apps app service | Android and iOS |
-| Microsoft Teams Services - this controls all services that support Microsoft Teams and all its Client Apps - Windows Desktop, iOS, Android, WP, and web client | Microsoft Teams | Windows 10, Windows 8.1, Windows 7, iOS, Android, and macOS |
+| Microsoft Teams Services - this client app controls all services that support Microsoft Teams and all its Client Apps - Windows Desktop, iOS, Android, WP, and web client | Microsoft Teams | Windows 10, Windows 8.1, Windows 7, iOS, Android, and macOS |
| Office 2016 apps, Office 2013 (with modern authentication), [OneDrive sync client](/onedrive/enable-conditional-access) | SharePoint | Windows 8.1, Windows 7 | | Office 2016 apps, Universal Office apps, Office 2013 (with modern authentication), [OneDrive sync client](/onedrive/enable-conditional-access) | SharePoint Online | Windows 10 | | Office 2016 (Word, Excel, PowerPoint, OneNote only). | SharePoint | macOS |
The device state condition can be used to exclude devices that are hybrid Azure
For example, *All users* accessing the *Microsoft Azure Management* cloud app including **All device state** excluding **Device Hybrid Azure AD joined** and **Device marked as compliant** and for *Access controls*, **Block**. - This example would create a policy that only allows access to Microsoft Azure Management from devices that are either hybrid Azure AD joined or devices marked as compliant.
+> [!IMPORTANT]
+> Device state and filters for devices cannot be used together in Conditional Access policy. Filters for devices provides more granular targeting, including support for targeting device state information through the `trustType` and `isCompliant` properties.
+
+## Filters for devices (preview)
+
+There is a new optional condition in Conditional Access called filters for devices. When configuring filters for devices as a condition, organizations can choose to include or exclude devices based on filters using a rule expression on device properties. The rule expression for filters for devices can be authored using the rule builder or rule syntax. This experience is similar to the one used for dynamic membership rules for groups. For more information, see the article [Conditional Access: Filters for devices (preview)](concept-condition-filters-for-devices.md).
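As an illustration of what a device filter expresses, the sketch below (a hypothetical helper, not the actual rule engine) evaluates a simplified OR-joined rule against device properties such as `trustType` and `isCompliant`:

```python
# Illustrative only: evaluate a simplified, OR-joined device filter.
# The real rule syntax supports richer operators and nesting (rule builder/rule syntax).
def matches_filter(device, clauses):
    # Include the device if any (property, expected-value) clause matches.
    return any(device.get(prop) == expected for prop, expected in clauses)

# Include devices that are compliant, or hybrid Azure AD joined (trustType "ServerAD"):
rule = [("isCompliant", True), ("trustType", "ServerAD")]

included = matches_filter({"isCompliant": True, "trustType": "AzureAD"}, rule)
excluded = matches_filter({"isCompliant": False, "trustType": "Workplace"}, rule)
```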
+ ## Next steps - [Conditional Access: Grant](concept-conditional-access-grant.md)
active-directory Concept Conditional Access Policies https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/conditional-access/concept-conditional-access-policies.md
The assignments portion controls the who, what, and where of the Conditional Acc
### Cloud apps or actions
-[Cloud apps or actions](concept-conditional-access-cloud-apps.md) can include or exclude cloud applications or user actions that will be subject to the policy.
+[Cloud apps or actions](concept-conditional-access-cloud-apps.md) can include or exclude cloud applications, user actions, or authentication contexts that will be subjected to the policy.
### Conditions
This assignment condition allows Conditional Access policies to target specific
This control is used to exclude devices that are hybrid Azure AD joined, or marked as compliant in Intune. This exclusion can be done to block unmanaged devices.
+#### Filters for devices (preview)
+
+This control allows targeting specific devices based on their attributes in a policy.
+ ## Access controls The access controls portion of the Conditional Access policy controls how a policy is enforced.
active-directory Concept Conditional Access Policy Common https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/conditional-access/concept-conditional-access-policy-common.md
Previously updated : 07/02/2020 Last updated : 05/13/2021 -+
active-directory Groups Self Service Management https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/enterprise-users/groups-self-service-management.md
Previously updated : 12/02/2020 Last updated : 05/18/2021
Groups created in | Security group default behavior | Microsoft 365 group defaul
## Make a group available for user self-service
-1. Sign in to the [Azure AD admin center](https://aad.portal.azure.com) with an account that's a global admin for the directory.
+1. Sign in to the [Azure AD admin center](https://aad.portal.azure.com) with an account that's been assigned the Global Administrator or Privileged Role Administrator role for the directory.
+ 1. Select **Groups**, and then select **General** settings.
+ ![Azure Active Directory groups general settings](./media/groups-self-service-management/groups-settings-general.png)
+ 1. Set **Owners can manage group membership requests in the Access Panel** to **Yes**.
-1. Set **Restrict access to Groups in the Access Panel** to **No**.
-1. If you set **Users can create security groups in Azure portals** or **Users can create Microsoft 365 groups in Azure portals** to
- - **Yes**: All users in your Azure AD organization are allowed to create new security groups and add members to these groups. These new groups would also show up in the Access Panel for all other users. If the policy setting on the group allows it, other users can create requests to join these groups
+1. Set **Restrict user ability to access groups features in the Access Panel** to **No**.
+
+1. If you set **Users can create security groups in Azure portals, API or PowerShell** or **Users can create Microsoft 365 groups in Azure portals, API or PowerShell** to
+
+ - **Yes**: All users in your Azure AD organization are allowed to create new security groups and add members to these groups in Azure portals, API or PowerShell. These new groups would also show up in the Access Panel for all other users. If the policy setting on the group allows it, other users can create requests to join these groups.
- **No**: Users can't create groups and can't change existing groups for which they are an owner. However, they can still manage the memberships of those groups and approve requests from other users to join their groups.
+ These settings were recently changed to add support for API and PowerShell. For more information about this change, see the next section [Groups setting change](#groups-setting-change).
+ You can also use **Owners who can assign members as group owners in the Azure portal** to achieve more granular access control over self-service group management for your users. When users can create groups, all users in your organization are allowed to create new groups and then can, as the default owner, add members to these groups. You can't specify individuals who can create their own groups. You can specify individuals only for making another group member a group owner.
When users can create groups, all users in your organization are allowed to crea
> [!NOTE] > An Azure Active Directory Premium (P1 or P2) license is required for users to request to join a security group or Microsoft 365 group and for owners to approve or deny membership requests. Without an Azure Active Directory Premium license, users can still manage their groups in the Access Panel, but they can't create a group that requires owner approval in the Access Panel, and they can't request to join a group.
+## Groups setting change
+
+The current security groups and Microsoft 365 groups settings are being deprecated and replaced. The current settings are being replaced because they only control group creation in Azure portals and do not apply to API or PowerShell. The new settings control group creation in Azure portals, and also API and PowerShell.
+
+| Deprecated setting | New setting |
+| | |
+| Users can create security groups in Azure portals | Users can create security groups in Azure portals, API or PowerShell |
+| Users can create Microsoft 365 groups in Azure portals | Users can create Microsoft 365 groups in Azure portals, API or PowerShell |
+
+Until the current setting is fully deprecated, both settings will appear in the Azure portals. You should configure this new setting before the end of **May 2021**. To configure the security groups settings, you must be assigned the Global Administrator or Privileged Role Administrator role.
+
+![Azure Active Directory security groups setting change](./media/groups-self-service-management/security-groups-setting.png)
+
+The following table helps you decide which values to choose.
+
+| If you want this ... | Choose these values |
+| | |
+| Users can create groups using Azure portals, API or PowerShell | Set both settings to **Yes**. Changes can take up to 15 minutes to take effect. |
+| Users **can't** create groups using Azure portals, API or PowerShell | Set both settings to **No**. Changes can take up to 15 minutes to take effect. |
+| Users can create groups using Azure portals, but not using API or PowerShell | Not supported |
+| Users can create groups using API or PowerShell, but not using Azure portals | Not supported |
+
+The following table lists what happens for different values for these settings. It's not recommended to have the deprecated setting and the new setting set to different values.
+
+| Users can create groups using Azure portals | Users can create groups using Azure portals, API or PowerShell | Effect on your tenant |
+| :: | :: | |
+| Yes | Yes | Users can create groups using Azure portals, API or PowerShell. Changes can take up to 15 minutes to take effect.|
+| No | No | Users **can't** create groups using Azure portals, API or PowerShell. Changes can take up to 15 minutes to take effect. |
+| Yes | No | Users **can't** create groups using Azure portals, API or PowerShell. It's not recommended to have these settings set to different values. Changes can take up to 15 minutes to take effect. |
+| No | Yes | Until the **Users can create groups using Azure portals** setting is fully deprecated in **June 2021**, users can create groups using API or PowerShell, but not Azure portals. Starting sometime in **June 2021**, the **Users can create groups using Azure portals, API or PowerShell** setting will take effect and users can create groups using Azure portals, API or PowerShell. |
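The combinations in the table above can be summarized as a small truth table. The following Python sketch (illustrative only, not an API) encodes the effective behavior, including the split behavior that lasts only while the deprecated setting is still active:

```python
def group_creation_effect(deprecated_yes, new_yes, legacy_setting_active=True):
    """Illustrative truth table for the two group-creation settings.

    deprecated_yes: 'Users can create groups using Azure portals'
    new_yes: 'Users can create groups using Azure portals, API or PowerShell'
    legacy_setting_active: True until the deprecated setting is retired (June 2021).
    Returns (can_create_in_portals, can_create_via_api_powershell).
    """
    if deprecated_yes and new_yes:
        return (True, True)
    if not new_yes:
        # When the new setting is No, creation is blocked everywhere,
        # regardless of the deprecated setting's value.
        return (False, False)
    # Deprecated No, new Yes: split behavior while the legacy setting is active.
    if legacy_setting_active:
        return (False, True)
    return (True, True)
```

Note that changes to either setting can take up to 15 minutes to take effect, which this sketch does not model.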
+ ## Next steps These articles provide additional information on Azure Active Directory.
active-directory Concept Identity Protection Risks https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/identity-protection/concept-identity-protection-risks.md
Previously updated : 04/13/2021 Last updated : 05/18/2021
These risks can be calculated in real-time or calculated offline using Microsoft
| New country | Offline | This detection is discovered by [Microsoft Cloud App Security (MCAS)](/cloud-app-security/anomaly-detection-policy#activity-from-infrequent-country). This detection considers past activity locations to determine new and infrequent locations. The anomaly detection engine stores information about previous locations used by users in the organization. | | Activity from anonymous IP address | Offline | This detection is discovered by [Microsoft Cloud App Security (MCAS)](/cloud-app-security/anomaly-detection-policy#activity-from-anonymous-ip-addresses). This detection identifies that users were active from an IP address that has been identified as an anonymous proxy IP address. | | Suspicious inbox forwarding | Offline | This detection is discovered by [Microsoft Cloud App Security (MCAS)](/cloud-app-security/anomaly-detection-policy#suspicious-inbox-forwarding). This detection looks for suspicious email forwarding rules, for example, if a user created an inbox rule that forwards a copy of all emails to an external address. |
+| Azure AD threat intelligence | Real-time or Offline | This risk detection type indicates sign-in activity that is unusual for the given user or is consistent with known attack patterns based on Microsoft's internal and external threat intelligence sources. |
### Other risk detections
active-directory Migration Resources https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/migration-resources.md
Resources to help you migrate application access and authentication to Azure Act
| [Deployment plan: Migrating from AD FS to pass-through authentication](https://aka.ms/ADFSTOPTADPDownload)|Azure AD pass-through authentication helps users sign in to both on-premises and cloud-based applications by using the same password. This feature provides your users with a better experience since they have one less password to remember. It also reduces IT helpdesk costs because users are less likely to forget how to sign in when they only need to remember one password. When people sign in using Azure AD, this feature validates users' passwords directly against your on-premises Active Directory.| | [Deployment plan: Enabling Single Sign-on to a SaaS app with Azure AD](https://aka.ms/SSODPDownload) | Single sign-on (SSO) helps you access all the apps and resources you need to do business, while signing in only once, using a single user account. For example, after a user has signed in, the user can move from Microsoft Office, to SalesForce, to Box without authenticating (for example, typing a password) a second time. | [Deployment plan: Extending apps to Azure AD with Application Proxy](https://aka.ms/AppProxyDPDownload)| Providing access from employee laptops and other devices to on-premises applications has traditionally involved virtual private networks (VPNs) or demilitarized zones (DMZs). Not only are these solutions complex and hard to make secure, but they are costly to set up and manage. Azure AD Application Proxy makes it easier to access on-premises applications. |
-| [Deployment plans](../fundamentals/active-directory-deployment-plans.md) | Find more deployment plans for deploying features such as multi-Factor authentication, Conditional Access, user provisioning, seamless SSO, self-service password reset, and more! |
+| [Deployment plans](../fundamentals/active-directory-deployment-plans.md) | Find more deployment plans for deploying features such as multi-Factor authentication, Conditional Access, user provisioning, seamless SSO, self-service password reset, and more! |
+| [Migrating apps from Symantec SiteMinder to Azure AD](https://azure.microsoft.com/mediahandler/files/resourcefiles/migrating-applications-from-symantec-siteminder-to-azure-active-directory/Migrating-applications-from-Symantec-SiteMinder-to-Azure-Active-Directory.pdf) | Get step-by-step guidance on application migration and integration options with an example that walks you through migrating applications from Symantec SiteMinder to Azure AD. |
active-directory Pim How To Start Security Review https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/privileged-identity-management/pim-how-to-start-security-review.md
This article describes how to create one or more access reviews for privileged A
12. Under **Review role membership**, select the privileged Azure AD roles to review. > [!NOTE]
- > - Roles selected here include both [permanent and eligible roles](../privileged-identity-management/pim-how-to-add-role-to-user.md).
> - Selecting more than one role will create multiple access reviews. For example, selecting five roles will create five separate access reviews. > - For roles with groups assigned to them, the access of each group linked with the role under review will be reviewed as a part of the access review. If you are creating an access review of **Azure AD roles**, the following shows an example of the Review membership list.
- ![Review membership pane listing Azure AD roles you can select](./media/pim-how-to-start-security-review/review-membership.png)
+ > [!NOTE]
+ > Selecting more than one role will create multiple access reviews. For example, selecting five roles will create five separate access reviews.
- If you are creating an access review of **Azure resource roles**, the following image shows an example of the Review membership list.
+1. In **assignment type**, scope the review by how the principal was assigned to the role. Choose **(Preview) eligible assignments only** to review eligible assignments (regardless of activation status when the review is created) or **(Preview) active assignments only** to review active assignments. Choose **all active and eligible assignments** to review all assignments regardless of type.
- ![Review membership pane listing Azure resource roles you can select](./media/pim-how-to-start-security-review/review-membership-azure-resource-roles.png)
+ ![Reviewers list of assignment types](./media/pim-how-to-start-security-review/assignment-type-select.png)
13. In the **Reviewers** section, select one or more people to review all the users. Or you can select to have the members review their own access.
active-directory Pim Resource Roles Start Access Review https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/privileged-identity-management/pim-resource-roles-start-access-review.md
The need for access to privileged Azure resource roles by employees changes over
1. Under **Review role membership**, select the privileged Azure roles to review. > [!NOTE]
- > - Roles selected here include both [permanent and eligible roles](../privileged-identity-management/pim-how-to-add-role-to-user.md).
- > - Selecting more than one role will create multiple access reviews. For example, selecting five roles will create five separate access reviews.
+ > Selecting more than one role will create multiple access reviews. For example, selecting five roles will create five separate access reviews.
If you are creating an access review of **Azure AD roles**, the following shows an example of the Review membership list.
- ![Review membership pane listing Azure AD roles you can select](./media/pim-resource-roles-start-access-review/review-membership.png)
+1. In **assignment type**, scope the review by how the principal was assigned to the role. Choose **(Preview) eligible assignments only** to review eligible assignments (regardless of activation status when the review is created) or **(Preview) active assignments only** to review active assignments. Choose **all active and eligible assignments** to review all assignments regardless of type.
- If you are creating an access review of **Azure resource roles**, the following image shows an example of the Review membership list.
-
- ![Review membership pane listing Azure resource roles you can select](./media/pim-resource-roles-start-access-review/review-membership-azure-resource-roles.png)
+ ![Reviewers list of assignment types](./media/pim-resource-roles-start-access-review/assignment-type-select.png)
1. In the **Reviewers** section, select one or more people to review all the users. Or you can select to have the members review their own access.
active-directory Reference Azure Monitor Sign Ins Log Schema https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/reports-monitoring/reference-azure-monitor-sign-ins-log-schema.md
This article describes the Azure Active Directory (Azure AD) sign-in log schema
| OperationVersion | - | The REST API version that's requested by the client. | | Category | - | For sign-ins, this value is always *SignIn*. | | TenantId | - | The tenant GUID that's associated with the logs. |
-| ResultType | - | The result of the sign-in operation can be *Success* or *Failure*. |
+| ResultType | - | The result of the sign-in operation. The value is *0* for success, or an *error code* if the sign-in failed or was interrupted. |
| ResultSignature | - | Contains the error code, if any, for the sign-in operation. | | ResultDescription | N/A or blank | Provides the error description for the sign-in operation. | | riskDetail | riskDetail | Provides the 'reason' behind a specific state of a risky user, sign-in or a risk detection. The possible values are: `none`, `adminGeneratedTemporaryPassword`, `userPerformedSecuredPasswordChange`, `userPerformedSecuredPasswordReset`, `adminConfirmedSigninSafe`, `aiConfirmedSigninSafe`, `userPassedMFADrivenByRiskBasedPolicy`, `adminDismissedAllRiskForUser`, `adminConfirmedSigninCompromised`, `unknownFutureValue`. The value `none` means that no action has been performed on the user or sign-in so far. <br>**Note:** Details for this property require an Azure AD Premium P2 license. Other licenses return the value `hidden`. |
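Because `ResultType` is `0` only on success, a log consumer can classify entries with a check like the following sketch (field names as in the schema above; the error code shown is an example only):

```python
def sign_in_succeeded(entry):
    # ResultType is "0" for success; any other value is an error code
    # for a failed or interrupted sign-in.
    return str(entry.get("ResultType")) == "0"

ok = sign_in_succeeded({"ResultType": "0"})
failed = sign_in_succeeded({"ResultType": "50126"})  # example error code only
```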
This article describes the Azure Active Directory (Azure AD) sign-in log schema
## Next steps * [Interpret audit logs schema in Azure Monitor](reference-azure-monitor-audit-log-schema.md)
-* [Read more about Azure platform logs](../../azure-monitor/essentials/platform-logs-overview.md)
+* [Read more about Azure platform logs](../../azure-monitor/essentials/platform-logs-overview.md)
active-directory Admin Units Add Manage Groups https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/admin-units-add-manage-groups.md
Previously updated : 03/10/2021 Last updated : 05/14/2021
In Azure Active Directory (Azure AD), you can add groups to an administrative unit for a more granular administrative scope of control.
-To prepare to use PowerShell and Microsoft Graph for administrative unit management, see [Get started](admin-units-manage.md#get-started).
+## Prerequisites
+
+- Azure AD Premium P1 or P2 license for each administrative unit administrator
+- Azure AD Free licenses for administrative unit members
+- Privileged Role Administrator or Global Administrator
+- AzureAD module when using PowerShell
+- Admin consent when using Graph Explorer for Microsoft Graph API
+
+For more information, see [Prerequisites to use PowerShell or Graph Explorer](prerequisites.md).
## Add groups to an administrative unit You can add groups to an administrative unit by using the Azure portal, PowerShell, or Microsoft Graph.
-### Use the Azure portal
+### Azure portal
You can assign only individual groups to an administrative unit. There is no option to assign groups as a bulk operation. In the Azure portal, you can assign a group to an administrative unit in either of two ways:
You can assign only individual groups to an administrative unit. There is no opt
1. Select one or more groups to be assigned to the administrative unit, and then select the **Select** button.
-### Use PowerShell
+### PowerShell
In the following example, use the `Add-AzureADMSAdministrativeUnitMember` cmdlet to add the group to the administrative unit. The object ID of the administrative unit and the object ID of the group to be added are taken as arguments. Change the highlighted section as required for your specific environment.
$GroupObj = Get-AzureADGroup -Filter "displayname eq 'TestGroup'"
Add-AzureADMSAdministrativeUnitMember -Id $adminUnitObj.Id -RefObjectId $GroupObj.ObjectId ```
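For comparison, the equivalent Microsoft Graph call adds a `$ref` member under the administrative unit. The Python sketch below (hypothetical helper; assumes the v1.0 `directory/administrativeUnits/{id}/members/$ref` endpoint and placeholder IDs) only builds the request URL and body rather than sending it:

```python
GRAPH = "https://graph.microsoft.com/v1.0"

def add_group_member_request(admin_unit_id, group_id):
    # POST this body to the URL with an authorized Graph client to add the group.
    url = f"{GRAPH}/directory/administrativeUnits/{admin_unit_id}/members/$ref"
    body = {"@odata.id": f"{GRAPH}/groups/{group_id}"}
    return url, body

url, body = add_group_member_request("au-0000-example", "group-1111-example")
```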
-### Use Microsoft Graph
+### Microsoft Graph API
Run the following commands:
Example
## View a list of groups in an administrative unit
-### Use the Azure portal
+### Azure portal
1. In the Azure portal, go to **Azure AD**.
Example
![Screenshot of the "Groups" pane displaying a list of groups in an administrative unit.](./media/admin-units-add-manage-groups/list-groups-in-admin-units.png)
-### Use PowerShell
+### PowerShell
To display a list of all the members of the administrative unit, run the following command:
Get-AzureADGroup -ObjectId $member.ObjectId
} ```
-### Use Microsoft Graph
+### Microsoft Graph API
Run the following command:
Body
## View a list of administrative units for a group
-### Use the Azure portal
+### Azure portal
1. In the Azure portal, go to **Azure AD**.
Body
![Screenshot of the "Administrative units" pane, displaying a list administrative units that a group is assigned to.](./media/admin-units-add-manage-groups/list-group-au.png)
-### Use PowerShell
+### PowerShell
Run the following command:
Run the following command:
Get-AzureADMSAdministrativeUnit | where { Get-AzureADMSAdministrativeUnitMember -ObjectId $_.ObjectId | where {$_.ObjectId -eq $groupObjId} } ```
-### Use Microsoft Graph
+### Microsoft Graph API
Run the following command:
https://graph.microsoft.com/v1.0/groups/{group-id}/memberOf/$/Microsoft.Graph.Ad
## Remove a group from an administrative unit
-### Use the Azure portal
+### Azure portal
You can remove a group from an administrative unit in the Azure portal in either of two ways:
You can remove a group from an administrative unit in the Azure portal in either
![Screenshot of the "Groups" pane, displaying a list of the groups in an administrative unit.](./media/admin-units-add-manage-groups/list-groups-in-admin-units.png)
-### Use PowerShell
+### PowerShell
Run the following command:
Run the following command:
Remove-AzureADMSAdministrativeUnitMember -ObjectId $adminUnitId -MemberId $memberGroupObjId ```
-### Use Microsoft Graph
+### Microsoft Graph API
Run the following command:
active-directory Admin Units Add Manage Users https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/admin-units-add-manage-users.md
Previously updated : 11/04/2020 Last updated : 05/14/2021
In Azure Active Directory (Azure AD), you can add users to an administrative unit for a more granular administrative scope of control.
-To prepare to use PowerShell and Microsoft Graph for administrative unit management, see [Get started](admin-units-manage.md#get-started).
+## Prerequisites
+
+- Azure AD Premium P1 or P2 license for each administrative unit administrator
+- Azure AD Free licenses for administrative unit members
+- Privileged Role Administrator or Global Administrator
+- AzureAD module when using PowerShell
+- Admin consent when using Graph Explorer for Microsoft Graph API
+
+For more information, see [Prerequisites to use PowerShell or Graph Explorer](prerequisites.md).
## Add users to an administrative unit
-### Use the Azure portal
+### Azure portal
You can assign users to administrative units individually or as a bulk operation.

- Assign individual users from a user profile:
- 1. Sign in to the [Azure AD admin center](https://portal.azure.com) with Privileged Role Administrator permissions.
+ 1. Sign in to the [Azure AD admin center](https://portal.azure.com).
1. Select **Users** and then, to open the user's profile, select the user to be assigned to an administrative unit.
- Assign individual users from an administrative unit:
- 1. Sign in to the [Azure AD admin center](https://portal.azure.com) with Privileged Role Administrator permissions.
+ 1. Sign in to the [Azure AD admin center](https://portal.azure.com).
 1. Select **Administrative units**, and then select the administrative unit where the user is to be assigned.
 1. Select **All users**, select **Add member** and then, on the **Add member** pane, select one or more users that you want to assign to the administrative unit.
- Assign users as a bulk operation:
- 1. Sign in to the [Azure AD admin center](https://portal.azure.com) with Privileged Role Administrator permissions.
+ 1. Sign in to the [Azure AD admin center](https://portal.azure.com).
1. Select **Administrative units**.
![Screenshot of the "Users" pane for assigning users to an administrative unit as a bulk operation.](./media/admin-units-add-manage-users/bulk-assign-to-admin-unit.png)
-### Use PowerShell
+### PowerShell
In PowerShell, use the `Add-AzureADMSAdministrativeUnitMember` cmdlet in the following example to add the user to the administrative unit. The object ID of the administrative unit to which you want to add the user and the object ID of the user you want to add are taken as arguments. Change the highlighted section as required for your specific environment.
Add-AzureADMSAdministrativeUnitMember -Id $adminUnitObj.Id -RefObjectId $userObj
```
-### Use Microsoft Graph
+### Microsoft Graph API
Replace the placeholder with test information and run the following command:
Example
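As a sketch of this Graph call, adding a user to an administrative unit uses the `$ref` member-add pattern; the `{admin-unit-id}` and `{user-id}` placeholders are illustrative:

```http
POST https://graph.microsoft.com/v1.0/directory/administrativeUnits/{admin-unit-id}/members/$ref
Content-Type: application/json

{
  "@odata.id": "https://graph.microsoft.com/v1.0/users/{user-id}"
}
```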
## View a list of administrative units for a user
-### Use the Azure portal
+### Azure portal
In the Azure portal, you can open a user's profile by doing the following:
![Screenshot of administrative units to which a user has been assigned.](./media/admin-units-add-manage-users/list-user-admin-units.png)
-### Use PowerShell
+### PowerShell
Run the following command:
Get-AzureADMSAdministrativeUnit | where { Get-AzureADMSAdministrativeUnitMember
> [!NOTE]
> By default, `Get-AzureADAdministrativeUnitMember` returns only 100 members of an administrative unit. To retrieve more members, you can add `-All $true`.
-### Use Microsoft Graph
+### Microsoft Graph API
Replace the placeholder with test information and run the following command:
https://graph.microsoft.com/v1.0/users/{user-id}/memberOf/$/Microsoft.Graph.Admi
## Remove a single user from an administrative unit
-### Use the Azure portal
+### Azure portal
You can remove a user from an administrative unit in either of two ways:
![Screenshot showing how to remove a user at the administrative unit level.](./media/admin-units-add-manage-users/admin-units-remove-user.png)
-### Use PowerShell
+### PowerShell
Run the following command:
Remove-AzureADMSAdministrativeUnitMember -Id $adminUnitId -MemberId $memberUserObjId
```
-### Use Microsoft Graph
+### Microsoft Graph API
Replace the placeholders with test information and run the following command:
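As a sketch, the Graph request for removing a single user from an administrative unit takes the `$ref` delete form; placeholders are illustrative:

```http
DELETE https://graph.microsoft.com/v1.0/directory/administrativeUnits/{admin-unit-id}/members/{user-id}/$ref
```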
active-directory Admin Units Assign Roles https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/admin-units-assign-roles.md
Previously updated : 04/14/2021 Last updated : 05/14/2021
In Azure Active Directory (Azure AD), for more granular administrative control, you can assign users to an Azure AD role with a scope that's limited to one or more administrative units.
-To prepare to use PowerShell and Microsoft Graph for administrative unit management, see [Get started](admin-units-manage.md#get-started).
+## Prerequisites
+
+- Azure AD Premium P1 or P2 license for each administrative unit administrator
+- Azure AD Free licenses for administrative unit members
+- Privileged Role Administrator or Global Administrator
+- AzureAD module when using PowerShell
+- Admin consent when using Graph explorer for Microsoft Graph API
+
+For more information, see [Prerequisites to use PowerShell or Graph Explorer](prerequisites.md).
+
## Available roles
The following security principals can be assigned to a role with an administrati
You can assign a scoped role by using the Azure portal, PowerShell, or Microsoft Graph.
-### Use the Azure portal
+### Azure portal
1. In the Azure portal, go to **Azure AD**.
You can assign a scoped role by using the Azure portal, PowerShell, or Microsoft
> [!NOTE]
> To assign a role on an administrative unit by using Azure AD Privileged Identity Management (PIM), see [Assign Azure AD roles in PIM](../privileged-identity-management/pim-how-to-add-role-to-user.md?tabs=new#assign-a-role-with-restricted-scope).
-### Use PowerShell
+### PowerShell
```powershell
$adminUser = Get-AzureADUser -ObjectId "Use the user's UPN, who would be an admin on this unit"
Add-AzureADMSScopedRoleMembership -Id $adminUnitObj.Id -RoleId $role.ObjectId -R
You can change the highlighted section as required for the specific environment.
-### Use Microsoft Graph
+### Microsoft Graph API
Request
Body
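For reference, a request and body for creating an administrative unit-scoped role assignment generally look like the following minimal sketch (placeholders such as `{user-id}` and `{role-definition-id}` are illustrative):

```http
POST https://graph.microsoft.com/beta/roleManagement/directory/roleAssignments
Content-Type: application/json

{
  "principalId": "{user-id}",
  "roleDefinitionId": "{role-definition-id}",
  "directoryScopeId": "/administrativeUnits/{admin-unit-id}"
}
```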
You can view a list of scoped admins by using the Azure portal, PowerShell, or Microsoft Graph.
-### Use the Azure portal
+### Azure portal
You can view all the role assignments created with an administrative unit scope in the [Administrative units section of Azure AD](https://ms.portal.azure.com/?microsoft_aad_iam_adminunitprivatepreview=true&microsoft_aad_iam_rbacv2=true#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/AdminUnit).
You can view all the role assignments created with an administrative unit scope
1. Select **Roles and administrators**, and then open a role to view the assignments in the administrative unit.
-### Use PowerShell
+### PowerShell
```powershell
$adminUnitObj = Get-AzureADMSAdministrativeUnit -Filter "displayname eq 'The display name of the unit'"
Get-AzureADMSScopedRoleMembership -Id $adminUnitObj.Id | fl *
```
You can change the highlighted section as required for your specific environment.
-### Use Microsoft Graph
+### Microsoft Graph API
Request
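As a sketch, listing the scoped role memberships of an administrative unit can be done with a single GET (the `{admin-unit-id}` placeholder is illustrative):

```http
GET https://graph.microsoft.com/v1.0/directory/administrativeUnits/{admin-unit-id}/scopedRoleMembers
```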
active-directory Admin Units Manage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/admin-units-manage.md
Previously updated : 11/04/2020 Last updated : 05/14/2021
For more granular administrative control in Azure Active Directory (Azure AD), you can assign users to an Azure AD role with a scope that's limited to one or more administrative units.
-## Get started
-1. To run queries from the following instructions via [Graph Explorer](https://aka.ms/ge), do the following:
+## Prerequisites
- a. In the Azure portal, go to Azure AD.
-
- b. In the applications list, select **Graph explorer**.
-
- c. On the **Permissions** pane, select **Grant admin consent for Graph explorer**.
+- Azure AD Premium P1 or P2 license for each administrative unit administrator
+- Azure AD Free licenses for administrative unit members
+- Privileged Role Administrator or Global Administrator
+- AzureAD module when using PowerShell
+- Admin consent when using Graph explorer for Microsoft Graph API
- ![Screenshot showing the "Grant admin consent for Graph explorer" link.](./media/admin-units-manage/select-graph-explorer.png)
--
-1. Use [Azure AD PowerShell](https://www.powershellgallery.com/packages/AzureAD/).
+For more information, see [Prerequisites to use PowerShell or Graph Explorer](prerequisites.md).
## Add an administrative unit

You can add an administrative unit by using the Azure portal, PowerShell, or Microsoft Graph.
-### Use the Azure portal
+### Azure portal
1. In the Azure portal, go to Azure AD. Then, on the left pane, select **Administrative units**.
You can add an administrative unit by using either the Azure portal or PowerShel
1. Select the blue **Add** button to finalize the administrative unit.
-### Use PowerShell
-
-Install [Azure AD PowerShell](https://www.powershellgallery.com/packages/AzureAD/) before you try to run the following commands:
+### PowerShell
```powershell
Connect-AzureAD
New-AzureADMSAdministrativeUnit -Description "West Coast region" -DisplayName "W
You can modify the values that are enclosed in quotation marks, as required.
-### Use Microsoft Graph
+### Microsoft Graph API
Request
Body
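The request and body for creating an administrative unit via Graph generally look like this minimal sketch (the display name and description echo the PowerShell example; adjust as needed):

```http
POST https://graph.microsoft.com/v1.0/directory/administrativeUnits
Content-Type: application/json

{
  "displayName": "West Coast region",
  "description": "Administrative unit for the west coast"
}
```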
In Azure AD, you can remove an administrative unit that you no longer need as a unit of scope for administrative roles.
-### Use the Azure portal
+### Azure portal
1. In the Azure portal, go to **Azure AD**, and then select **Administrative units**.
1. Select the administrative unit to be deleted, and then select **Delete**.
In Azure AD, you can remove an administrative unit that you no longer need as a
![Screenshot of the administrative unit Delete button and confirmation window.](./media/admin-units-manage/select-admin-unit-to-delete.png)
-### Use PowerShell
+### PowerShell
```powershell
$adminUnitObj = Get-AzureADMSAdministrativeUnit -Filter "displayname eq 'DeleteMe Admin Unit'"
Remove-AzureADMSAdministrativeUnit -Id $adminUnitObj.Id
```
You can modify the values that are enclosed in quotation marks, as required for the specific environment.
-### Use the Graph API
+### Microsoft Graph API
Request
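As a sketch, deleting an administrative unit via Graph is a single DELETE (placeholder illustrative):

```http
DELETE https://graph.microsoft.com/v1.0/directory/administrativeUnits/{admin-unit-id}
```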
active-directory Administrative Units https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/administrative-units.md
A central administrator could:
## License requirements
-To use administrative units, you need an Azure Active Directory Premium license for each administrative unit admin, and Azure Active Directory Free licenses for administrative unit members. For more information, see [Getting started with Azure AD Premium](../fundamentals/active-directory-get-started-premium.md).
+Using administrative units requires an Azure AD Premium P1 license for each administrative unit administrator, and Azure AD Free licenses for administrative unit members. To find the right license for your requirements, see [Comparing generally available features of the Free and Premium editions](https://azure.microsoft.com/pricing/details/active-directory/).
## Manage administrative units
You can expect the creation of administrative units in the organization to go th
## Currently supported scenarios
-As a Global Administrator or a Privileged Role Administrator, you can use the Azure AD portal to:
+As a Global Administrator or a Privileged Role Administrator, you can use the Azure portal to:
- Create administrative units
- Add users and groups as members of administrative units
The following sections describe current support for administrative unit scenario
### Administrative unit management
-| Permissions | Graph/PowerShell | Azure AD portal | Microsoft 365 admin center |
+| Permissions | Graph/PowerShell | Azure portal | Microsoft 365 admin center |
| --- | --- | --- | --- |
| Creating and deleting administrative units | Supported | Supported | Not supported |
| Adding and removing administrative unit members individually | Supported | Supported | Not supported |
The following sections describe current support for administrative unit scenario
### User management
-| Permissions | Graph/PowerShell | Azure AD portal | Microsoft 365 admin center |
+| Permissions | Graph/PowerShell | Azure portal | Microsoft 365 admin center |
| --- | --- | --- | --- |
| Administrative unit-scoped management of user properties, passwords, and licenses | Supported | Supported | Supported |
| Administrative unit-scoped blocking and unblocking of user sign-ins | Supported | Supported | Supported |
The following sections describe current support for administrative unit scenario
### Group management
-| Permissions | Graph/PowerShell | Azure AD portal | Microsoft 365 admin center |
+| Permissions | Graph/PowerShell | Azure portal | Microsoft 365 admin center |
| --- | --- | --- | --- |
| Administrative unit-scoped management of group properties and members | Supported | Supported | Not supported |
| Administrative unit-scoped management of group licensing | Supported | Supported | Not supported |
-Administrative units apply scope only to management permissions. They don't prevent members or administrators from using their [default user permissions](../fundamentals/users-default-permissions.md) to browse other users, groups, or resources outside the administrative unit. In the Microsoft 365 admin center, users outside a scoped admin's administrative units are filtered out. But you can browse other users in the Azure AD portal, PowerShell, and other Microsoft services.
+Administrative units apply scope only to management permissions. They don't prevent members or administrators from using their [default user permissions](../fundamentals/users-default-permissions.md) to browse other users, groups, or resources outside the administrative unit. In the Microsoft 365 admin center, users outside a scoped admin's administrative units are filtered out. But you can browse other users in the Azure portal, PowerShell, and other Microsoft services.
## Next steps
active-directory Concept Understand Roles https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/concept-understand-roles.md
When we say separate role-based access control system, it means there is a diffe
## Why some Azure AD roles are for other services
-Microsoft 365 has a number of role-based access control systems that developed independently over time, each with its own service portal. To make it convenient for you to manage identity across Microsoft 365 from the Azure AD portal, we have added some service-specific built-in roles, each of which grants administrative access to a Microsoft 365 service. An example of this addition is the Exchange Administrator role in Azure AD. This role is equivalent to the [Organization Management role group](/exchange/organization-management-exchange-2013-help) in the Exchange role-based access control system, and can manage all aspects of Exchange. Similarly, we added the Intune Administrator role, Teams Administrator, SharePoint Administrator, and so on. Service-specific roles is one category of Azure AD built-in roles in the following section.
+Microsoft 365 has a number of role-based access control systems that developed independently over time, each with its own service portal. To make it convenient for you to manage identity across Microsoft 365 from the Azure portal, we have added some service-specific built-in roles, each of which grants administrative access to a Microsoft 365 service. An example of this addition is the Exchange Administrator role in Azure AD. This role is equivalent to the [Organization Management role group](/exchange/organization-management-exchange-2013-help) in the Exchange role-based access control system, and can manage all aspects of Exchange. Similarly, we added the Intune Administrator role, Teams Administrator, SharePoint Administrator, and so on. Service-specific roles is one category of Azure AD built-in roles in the following section.
## Categories of Azure AD roles
active-directory Custom Assign Graph https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/custom-assign-graph.md
Previously updated : 11/05/2020 Last updated : 05/14/2021
You can automate how you assign roles to user accounts using the Microsoft Graph API. This article covers POST, GET, and DELETE operations on roleAssignments.
-## Required permissions
+## Prerequisites
-Connect to your Azure AD organization using a Global administrator or Privileged role administrator account to assign or remove roles.
+- Azure AD Premium P1 or P2 license
+- Privileged Role Administrator or Global Administrator
+- Admin consent when using Graph explorer for Microsoft Graph API
+
+For more information, see [Prerequisites to use PowerShell or Graph Explorer](prerequisites.md).
## POST Operations on RoleAssignment
active-directory Custom Assign Powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/custom-assign-powershell.md
Previously updated : 11/04/2020 Last updated : 05/14/2021
This article describes how to create a role assignment at organization-wide scop
For more information about Azure AD admin roles, see [Assigning administrator roles in Azure Active Directory](permissions-reference.md).
-## Required permissions
+## Prerequisites
-Connect to your Azure AD organization using a global administrator account to assign or remove roles.
+- Azure AD Premium P1 or P2 license
+- Privileged Role Administrator or Global Administrator
+- AzureADPreview module when using PowerShell
-## Prepare PowerShell
-
-Install the Azure AD PowerShell module from the [PowerShell Gallery](https://www.powershellgallery.com/packages/AzureADPreview). Then import the Azure AD PowerShell preview module, using the following command:
-
-``` PowerShell
-Import-Module -Name AzureADPreview
-```
-
-To verify that the module is ready to use, match the version returned by the following command to the one listed here:
-
-``` PowerShell
-Get-Module -Name AzureADPreview
- ModuleType Version Name ExportedCommands
- - - -
- Binary 2.0.0.115 AzureADPreview {Add-AzureADMSAdministrati...}
-```
-
-Now you can start using the cmdlets in the module. For a full description of the cmdlets in the Azure AD module, see the online reference documentation for [Azure AD preview module](https://www.powershellgallery.com/packages/AzureADPreview).
+For more information, see [Prerequisites to use PowerShell or Graph Explorer](prerequisites.md).
## Assign a directory role to a user or service principal with resource scope
active-directory Custom Available Permissions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/custom-available-permissions.md
Ability to update the delegated permissions, application permissions, authorized
Grants the same permissions as microsoft.directory/applications/permissions/update, but only for single-tenant applications.
-## Required license plan
+## License requirements
[!INCLUDE [License requirement for using custom roles in Azure AD](../../../includes/active-directory-p1-license.md)]
active-directory Custom Consent Permissions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/custom-consent-permissions.md
This article contains the currently available app consent permissions for custom role definitions in Azure Active Directory (Azure AD). In this article, you'll find the permissions required for some common scenarios related to app consent and permissions.
-## Required license plan
+## License requirements
-Using this feature requires an Azure AD Premium P1 license for your Azure AD organization. To find the right license for your requirements, see [Comparing generally available features of the Free, Basic, and Premium editions](https://azure.microsoft.com/pricing/details/active-directory/).
## App consent permissions
active-directory Custom Create https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/custom-create.md
Previously updated : 01/05/2021 Last updated : 05/14/2021
This article describes how to create new custom roles in Azure Active Directory
Custom roles can be created in the [Roles and administrators](https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/RolesAndAdministrators) tab on the Azure AD overview page.
+## Prerequisites
+
+- Azure AD Premium P1 or P2 license
+- Privileged Role Administrator or Global Administrator
+- AzureADPreview module when using PowerShell
+- Admin consent when using Graph explorer for Microsoft Graph API
+
+For more information, see [Prerequisites to use PowerShell or Graph Explorer](prerequisites.md).
+
## Create a role in the Azure portal

### Create a new custom role to grant access to manage app registrations
-1. Sign in to the [Azure AD admin center](https://aad.portal.azure.com) with Privileged role administrator or Global administrator permissions in the Azure AD organization.
+1. Sign in to the [Azure AD admin center](https://aad.portal.azure.com).
1. Select **Azure Active Directory** > **Roles and administrators** > **New custom role**.

   ![Create or edit roles from the Roles and administrators page](./media/custom-create/new-custom-role.png)
Your custom role will show up in the list of available roles to assign.
## Create a role using PowerShell
-### Prepare PowerShell
-
-First, you must [download the Azure AD Preview PowerShell module](https://www.powershellgallery.com/packages/AzureADPreview).
-
-To install the Azure AD PowerShell module, use the following commands:
-
-``` PowerShell
-Install-Module -Name AzureADPreview
-Import-Module -Name AzureADPreview
-```
-
-To verify that the module is ready to use, use the following command:
-
-``` PowerShell
-Get-Module -Name AzureADPreview
-
- ModuleType Version Name ExportedCommands
- - - -
- Binary 2.0.0.115 AzureADPreview {Add-AzureADAdministrati...}
-```
### Connect to Azure

To connect to Azure Active Directory, use the following command:
$rolePermissions = @{'allowedResourceActions'= $allowedResourceAction}
$customAdmin = New-AzureADMSRoleDefinition -RolePermissions $rolePermissions -DisplayName $displayName -Description $description -TemplateId $templateId -IsEnabled $true
```
-### Assign the custom role using Azure AD PowerShell
+### Assign the custom role using PowerShell
Assign the role using the following PowerShell script:
$resourceScope = '/' + $appRegistration.objectId
$roleAssignment = New-AzureADMSRoleAssignment -ResourceScope $resourceScope -RoleDefinitionId $roleDefinition.Id -PrincipalId $user.objectId
```
-## Create a role with Graph API
+## Create a role with the Microsoft Graph API
1. Create the role definition.
$roleAssignment = New-AzureADMSRoleAssignment -ResourceScope $resourceScope -Rol
Like built-in roles, custom roles are assigned by default at the default organization-wide scope to grant access permissions over all app registrations in your organization. But unlike built-in roles, custom roles can also be assigned at the scope of a single Azure AD resource. This allows you to give the user the permission to update credentials and basic properties of a single app without having to create a second custom role.
-1. Sign in to the [Azure AD admin center](https://aad.portal.azure.com) with Application developer permissions in the Azure AD organization.
+1. Sign in to the [Azure AD admin center](https://aad.portal.azure.com) with Application Developer permissions.
1. Select **App registrations**.
1. Select the app registration to which you are granting access to manage. You might have to select **All applications** to see the complete list of app registrations in your Azure AD organization.
active-directory Custom Enterprise App Permissions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/custom-enterprise-app-permissions.md
This article contains the currently available enterprise application permissions for custom role definitions in Azure Active Directory (Azure AD). In this article, you'll find permission lists for some common scenarios and the full list of enterprise app permissions. Application Proxy permissions are not currently rolled out in this release.
-## Required license plan
+## License requirements
-Using this feature requires an Azure AD Premium P1 license for your Azure AD organization. To find the right license for your requirements, see [Comparing generally available features of the Free, Basic, and Premium editions](https://azure.microsoft.com/pricing/details/active-directory/).
## Enterprise application permissions
active-directory Custom Enterprise Apps https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/custom-enterprise-apps.md
Previously updated : 04/14/2021 Last updated : 05/14/2021
This article explains how to create a custom role with permissions to manage enterprise app assignments for users and groups in Azure Active Directory (Azure AD). For the elements of roles assignments and the meaning of terms such as subtype, permission, and property set, see the [custom roles overview](custom-overview.md).
+## Prerequisites
+
+- Azure AD Premium P1 or P2 license
+- Privileged Role Administrator or Global Administrator
+- AzureADPreview module when using PowerShell
+- Admin consent when using Graph explorer for Microsoft Graph API
+
+For more information, see [Prerequisites to use PowerShell or Graph Explorer](prerequisites.md).
+
## Enterprise app role permissions

There are two enterprise app permissions discussed in this article. All examples use the update permission.
Granting the update permission is done in two steps:
1. Create a custom role with permission `microsoft.directory/servicePrincipals/appRoleAssignedTo/update`
1. Grant users or groups permissions to manage user and group assignments to enterprise apps. This is when you can set the scope to the organization-wide level or to a single application.
-## Use the Azure AD admin center
+## Azure portal
### Create a new custom role

> [!NOTE]
> Custom roles are created and managed at an organization-wide level and are available only from the organization's Overview page.
-1. Sign in to the [Azure AD admin center](https://aad.portal.azure.com) with Privileged Role Administrator or Global Administrator permissions in your organization.
+1. Sign in to the [Azure AD admin center](https://aad.portal.azure.com).
1. Select **Azure Active Directory**, select **Roles and administrators**, and then select **New custom role**.

   ![Add a new custom role from the roles list in Azure AD](./media/custom-enterprise-apps/new-custom-role.png)
Granting the update permission is done in two steps:
![Now you can create the custom role](./media/custom-enterprise-apps/role-custom-create.png)
-### Assign the role to a user using the Azure AD portal
+### Assign the role to a user using the Azure portal
-1. Sign in to the [Azure AD admin center](https://aad.portal.azure.com) with Privileged Role administrator role permissions.
+1. Sign in to the [Azure AD admin center](https://aad.portal.azure.com).
1. Select **Azure Active Directory** and then select **Roles and administrators**.
1. Select the **Grant permissions to manage user and group assignments** role.
Granting the update permission is done in two steps:
![Verify the user permissions](./media/custom-enterprise-apps/verify-user-permissions.png)
-## Use Azure AD PowerShell
+## PowerShell
For more detail, see [Create and assign a custom role](custom-create.md) and [Assign custom roles with resource scope using PowerShell](custom-assign-powershell.md).
-First, install the Azure AD PowerShell module from [the PowerShell Gallery](https://www.powershellgallery.com/packages/AzureADPreview/2.0.0.17). Then import the Azure AD PowerShell preview module, using the following command:
-
-```powershell
-Import-Module -Name AzureADPreview
-```
-
-To verify that the module is ready to use, match the version returned by the following command to the one listed here:
-
-```powershell
-Get-Module -Name AzureADPreview
- ModuleType Version Name ExportedCommands
- - - -
- Binary 2.0.0.115 AzureADPreview {Add-AzureADAdministrati...}
-```
### Create a custom role

Create a new role using the following PowerShell script:
$resourceScope = '/' + $appRegistration.objectId
$roleAssignment = New-AzureADMSRoleAssignment -ResourceScope $resourceScope -RoleDefinitionId $roleDefinition.Id -PrincipalId $user.objectId
```
-## Use the Microsoft Graph API
+## Microsoft Graph API
Create a custom role using the provided example in the Microsoft Graph API. For more detail, see [Create and assign a custom role](custom-create.md) and [Assign custom admin roles using the Microsoft Graph API](custom-assign-graph.md).
https://graph.microsoft.com/beta/roleManagement/directory/roleDefinitionsIsEnabl
}
```
-### Assign the custom role using Microsoft Graph API
+### Assign the custom role using the Microsoft Graph API
The role assignment combines a security principal ID (which can be a user or service principal), a role definition ID, and an Azure AD resource scope. For more information on the elements of a role assignment, see the [custom roles overview](custom-overview.md)
active-directory Custom Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/custom-overview.md
A role definition, or role, is a collection of permissions. A role definition li
A scope is the restriction of permitted actions to a particular Azure AD resource as part of a role assignment. When you assign a role, you can specify a scope that limits the administrator's access to a specific resource. For example, if you want to grant a developer a custom role, but only to manage a specific application registration, you can include the specific application registration as a scope in the role assignment.
-## Required license plan
+## License requirements
-Using built-in roles in Azure AD is free, while custom roles requires an Azure AD Premium P1 license. To find the right license for your requirements, see [Comparing generally available features of the Free, Basic, and Premium editions](https://azure.microsoft.com/pricing/details/active-directory).
+Using built-in roles in Azure AD is free, while custom roles require an Azure AD Premium P1 license. To find the right license for your requirements, see [Comparing generally available features of the Free and Premium editions](https://azure.microsoft.com/pricing/details/active-directory/).
## Next steps
active-directory Delegate App Roles https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/delegate-app-roles.md
Creating custom roles and assigning custom roles are separate steps:
This separation allows you to create a single role definition and then assign it many times at different *scopes*. A custom role can be assigned at organization-wide scope, or it can be assigned at the scope of a single Azure AD object. An example of an object scope is a single app registration. Using different scopes, the same role definition can be assigned to Sally over all app registrations in the organization and then to Naveen over only the Contoso Expense Reports app registration.

Tips when creating and using custom roles for delegating application management:
-- Custom roles only grant access in the most current app registration blades of the Azure AD portal. They do not grant access in the legacy app registrations blades.
-- Custom roles do not grant access to the Azure AD portal when the "Restrict access to Azure AD administration portal" user setting is set to Yes.
+- Custom roles only grant access in the most current app registration blades of the Azure portal. They do not grant access in the legacy app registrations blades.
+- Custom roles do not grant access to the Azure portal when the "Restrict access to Azure AD administration portal" user setting is set to Yes.
- App registrations the user has access to using role assignments only show up in the 'All applications' tab on the App registration page. They do not show up in the 'Owned applications' tab.

For more information on the basics of custom roles, see the [custom roles overview](custom-overview.md), as well as how to [create a custom role](custom-create.md) and how to [assign a role](custom-assign-powershell.md).
active-directory Groups Assign Role https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/groups-assign-role.md
Previously updated : 11/05/2020 Last updated : 05/14/2021
This section describes how an IT admin can assign an Azure Active Directory (Azure AD) role to an Azure AD group.
-## Using Azure AD admin center
+## Prerequisites
+
+- Azure AD Premium P1 or P2 license
+- Privileged Role Administrator or Global Administrator
+- AzureADPreview module when using PowerShell
+- Admin consent when using Graph explorer for Microsoft Graph API
+
+For more information, see [Prerequisites to use PowerShell or Graph Explorer](prerequisites.md).
+
+## Azure portal
Assigning a group to an Azure AD role is similar to assigning users and service principals except that only groups that are role-assignable can be used. In the Azure portal, only groups that are role-assignable are displayed.
-1. Sign in to the [Azure AD admin center](https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Overview) with Privileged role administrator or Global administrator permissions in the Azure AD organization.
+1. Sign in to the [Azure AD admin center](https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Overview).
1. Select **Azure Active Directory** > **Roles and administrators**, and select the role you want to assign.
Assigning a group to an Azure AD role is similar to assigning users and service
For more information on assigning role permissions, see [Assign administrator and non-administrator roles to users](../fundamentals/active-directory-users-assign-role-azure-portal.md).
-## Using PowerShell
+## PowerShell
### Create a group that can be assigned to role
$roleDefinition = Get-AzureADMSRoleDefinition -Filter "displayName eq 'Helpdesk Administrator'"
$roleAssignment = New-AzureADMSRoleAssignment -ResourceScope '/' -RoleDefinitionId $roleDefinition.Id -PrincipalId $group.Id
```
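The snippet above is abbreviated by the diff context. Assuming the AzureADPreview module and an existing session (`Connect-AzureAD`), a fuller sketch of assigning a role to a role-assignable group might look like this (the group and role names are illustrative):

```powershell
# Illustrative names; assumes Connect-AzureAD has already been run
$group = Get-AzureADMSGroup -SearchString "Contoso_Helpdesk_Administrators"
$roleDefinition = Get-AzureADMSRoleDefinition -Filter "displayName eq 'Helpdesk Administrator'"

# Assign the role to the group at organization-wide scope ('/')
$roleAssignment = New-AzureADMSRoleAssignment -ResourceScope '/' -RoleDefinitionId $roleDefinition.Id -PrincipalId $group.Id
```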
-## Using Microsoft Graph API
+## Microsoft Graph API
### Create a group that can be assigned Azure AD role
active-directory Groups Concept https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/groups-concept.md
The following scenarios are not supported right now:
We are fixing these issues.
-## Required license plan
+## License requirements
-Using this feature requires you to have an available Azure AD Premium P1 license in your Azure AD organization. To use also Privileged Identity Management for just-in-time role activation requires you to have an available Azure AD Premium P2 license. To find the right license for your requirements, see [Comparing generally available features of the Free and Premium plans](../fundamentals/active-directory-whatis.md#what-are-the-azure-ad-licenses).
+Using this feature requires an Azure AD Premium P1 license. To also use Privileged Identity Management for just-in-time role activation requires an Azure AD Premium P2 license. To find the right license for your requirements, see [Comparing generally available features of the Free and Premium editions](https://azure.microsoft.com/pricing/details/active-directory/).
## Next steps
active-directory Groups Create Eligible https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/groups-create-eligible.md
Previously updated : 11/05/2020 Last updated : 05/14/2021
# Create a role-assignable group in Azure Active Directory
-You can only assign a role to a group that was created with the 'isAssignableToRole' property set to True, or was created in the Azure AD portal with **Azure AD roles can be assigned to the group** turned on. This group attribute makes the group one that can be assigned to a role in Azure Active Directory (Azure AD). This article describes how to create this special kind of group. **Note:** A group with isAssignableToRole property set to true cannot be of dynamic membership type. For more information, see [Using a group to manage Azure AD role assignments](groups-concept.md).
+You can only assign a role to a group that was created with the 'isAssignableToRole' property set to True, or was created in the Azure portal with **Azure AD roles can be assigned to the group** turned on. This group attribute makes the group one that can be assigned to a role in Azure Active Directory (Azure AD). This article describes how to create this special kind of group. **Note:** A group with isAssignableToRole property set to true cannot be of dynamic membership type. For more information, see [Using a group to manage Azure AD role assignments](groups-concept.md).
-## Using Azure AD admin center
+## Prerequisites
-1. Sign in to the [Azure AD admin center](https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Overview) with Privileged role administrator or Global administrator permissions in the Azure AD organization.
+- Azure AD Premium P1 or P2 license
+- Privileged Role Administrator or Global Administrator
+- AzureADPreview module when using PowerShell
+- Admin consent when using Graph explorer for Microsoft Graph API
+
+For more information, see [Prerequisites to use PowerShell or Graph Explorer](prerequisites.md).
+
+## Azure portal
+
+1. Sign in to the [Azure AD admin center](https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Overview).
1. Select **Groups** > **All groups** > **New group**. [![Open Azure Active Directory and create a new group.](./media/groups-create-eligible/new-group.png "Open Azure Active Directory and create a new group.")](./media/groups-create-eligible/new-group.png#lightbox)
You can only assign a role to a group that was created with the ΓÇÿisAssignableT
The group is created with any roles you might have assigned to it.
-## Using PowerShell
-
-### Install the Azure AD preview module
-
-```powershell
-Install-Module -Name AzureADPreview
-Import-Module -Name AzureADPreview
-```
-
-To verify that the module is ready to use, issue the following command:
-
-```powershell
-Get-Module -Name AzureADPreview
-```
+## PowerShell
### Create a group that can be assigned to role
Add-AzureADGroupMember -ObjectId $roleAssignablegroup.Id -RefObjectId $member.ObjectId
}
```
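The member-adding loop above presumes a role-assignable group already exists. A sketch of creating one (display name and mail nickname are illustrative; note that `IsAssignableToRole` cannot be changed after the group is created):

```powershell
# Create a role-assignable security group (requires AzureADPreview)
$roleAssignablegroup = New-AzureADMSGroup -DisplayName "Contoso_Helpdesk_Administrators" `
    -Description "Members are assigned the Helpdesk Administrator role" `
    -MailEnabled $false -MailNickname "contosohelpdesk" `
    -SecurityEnabled $true -IsAssignableToRole $true
```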
-## Using Microsoft Graph API
+## Microsoft Graph API
### Create a role-assignable group in Azure AD
active-directory Groups Pim Eligible https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/groups-pim-eligible.md
Previously updated : 11/05/2020 Last updated : 05/14/2021
This article describes how you can assign an Azure Active Directory (Azure AD) r
> [!NOTE]
> You must be using the updated version of Privileged Identity Management to be able to assign a group to an Azure AD role using PIM. You might be on an older version of PIM if your Azure AD organization leverages the Privileged Identity Management API. If so, please reach out to the alias pim_preview@microsoft.com to move your organization and update your API. Learn more at [Azure AD roles and features in PIM](../privileged-identity-management/azure-ad-roles-features.md).
-## Using Azure AD admin center
+## Prerequisites
-1. Sign in to [Azure AD Privileged Identity Management](https://ms.portal.azure.com/?Microsoft_AAD_IAM_GroupRoles=true&Microsoft_AAD_IAM_userRolesV2=true&Microsoft_AAD_IAM_enablePimIntegration=true#blade/Microsoft_Azure_PIMCommon/CommonMenuBlade/quickStart) as a Privileged role administrator or Global administrator in your organization.
+- Azure AD Premium P2 license
+- Privileged Role Administrator or Global Administrator
+- AzureADPreview module when using PowerShell
+- Admin consent when using Graph explorer for Microsoft Graph API
+
+For more information, see [Prerequisites to use PowerShell or Graph Explorer](prerequisites.md).
+
+## Azure portal
+
+1. Sign in to [Azure AD Privileged Identity Management](https://ms.portal.azure.com/?Microsoft_AAD_IAM_GroupRoles=true&Microsoft_AAD_IAM_userRolesV2=true&Microsoft_AAD_IAM_enablePimIntegration=true#blade/Microsoft_Azure_PIMCommon/CommonMenuBlade/quickStart).
1. Select **Privileged Identity Management** > **Azure AD roles** > **Roles** > **Add assignments**
This article describes how you can assign an Azure Active Directory (Azure AD) r
![select the user to whom you're assigning the role](./media/groups-pim-eligible/set-assignment-settings.png)
-## Using PowerShell
-
-### Download the Azure AD Preview PowerShell module
-
-To install the Azure AD #PowerShell module, use the following cmdlets:
-
-```powershell
-Install-Module -Name AzureADPreview
-Import-Module -Name AzureADPreview
-```
-
-To verify that the module is ready to use, use the following cmdlet:
-
-```powershell
-Get-Module -Name AzureADPreview
-```
+## PowerShell
### Assign a group as an eligible member of a role
$schedule.endDateTime = "2019-07-25T20:49:11.770Z"
Open-AzureADMSPrivilegedRoleAssignmentRequest -ProviderId aadRoles -Schedule $schedule -ResourceId "[YOUR TENANT ID]" -RoleDefinitionId "9f8c1837-f885-4dfd-9a75-990f9222b21d" -SubjectId "[YOUR GROUP ID]" -AssignmentState "Eligible" -Type "AdminAdd"
```
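The diff above shows only the tail of the script. Assuming the AzureADPreview module, the schedule object it references is built first; a fuller sketch (tenant, group, and role IDs are placeholders as in the original):

```powershell
# Build the schedule for the eligible assignment
$schedule = New-Object Microsoft.Open.MSGraph.Model.AzureADMSPrivilegedSchedule
$schedule.Type = "Once"
$schedule.StartDateTime = (Get-Date).ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:ss.fffZ")
$schedule.EndDateTime = "2019-07-25T20:49:11.770Z"

# Request the eligible assignment through PIM (IDs are placeholders)
Open-AzureADMSPrivilegedRoleAssignmentRequest -ProviderId aadRoles -Schedule $schedule `
    -ResourceId "[YOUR TENANT ID]" -RoleDefinitionId "9f8c1837-f885-4dfd-9a75-990f9222b21d" `
    -SubjectId "[YOUR GROUP ID]" -AssignmentState "Eligible" -Type "AdminAdd"
```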
-## Using Microsoft Graph API
+## Microsoft Graph API
```http
POST
active-directory Groups Remove Assignment https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/groups-remove-assignment.md
Previously updated : 11/05/2020 Last updated : 05/14/2021
This article describes how an IT admin can remove Azure AD roles assigned to groups. In the Azure portal, you can now remove both direct and indirect role assignments to a user. If a user is assigned a role by a group membership, remove the user from the group to remove the role assignment.
-## Using Azure admin center
+## Prerequisites
-1. Sign in to the [Azure AD admin center](https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Overview) with Privileged role administrator or Global administrator permissions in the Azure AD organization.
+- Azure AD Premium P1 or P2 license
+- Privileged Role Administrator or Global Administrator
+- AzureADPreview module when using PowerShell
+- Admin consent when using Graph explorer for Microsoft Graph API
+
+For more information, see [Prerequisites to use PowerShell or Graph Explorer](prerequisites.md).
+
+## Azure portal
+
+1. Sign in to the [Azure AD admin center](https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Overview).
1. Select **Roles and administrators** > ***role name***.
This article describes how an IT admin can remove Azure AD roles assigned to gro
1. When asked to confirm your action, select **Yes**.
-## Using PowerShell
+## PowerShell
### Create a group that can be assigned to role
$roleAssignment = New-AzureADMSRoleAssignment -ResourceScope '/' -RoleDefinitionId $roleDefinition.Id -PrincipalId $group.Id
Remove-AzureAdMSRoleAssignment -Id $roleAssignment.Id
```
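Assuming the AzureADPreview module, the removal flow can also be sketched end to end against an existing assignment (the group name is illustrative):

```powershell
# Find the group's role assignment, then remove it
$group = Get-AzureADMSGroup -SearchString "Contoso_Helpdesk_Administrators"
$roleAssignment = Get-AzureADMSRoleAssignment -Filter "principalId eq '$($group.Id)'"
Remove-AzureADMSRoleAssignment -Id $roleAssignment.Id
```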
-## Using Microsoft Graph API
+## Microsoft Graph API
### Create a group that can be assigned an Azure AD role
active-directory Groups View Assignments https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/groups-view-assignments.md
Previously updated : 11/05/2020 Last updated : 05/14/2021
This section describes how the roles assigned to a group can be viewed using Azure AD admin center. Viewing groups and assigned roles are default user permissions.
-1. Sign in to the [Azure AD admin center](https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Overview) with any non-admin or admin credentials.
+## Prerequisites
+
+- AzureADPreview module when using PowerShell
+- Admin consent when using Graph explorer for Microsoft Graph API
+
+For more information, see [Prerequisites to use PowerShell or Graph Explorer](prerequisites.md).
+
+## Azure portal
+
+1. Sign in to the [Azure AD admin center](https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Overview).
1. Select the group that you are interested in.
This section describes how the roles assigned to a group can be viewed using Azu
![View all roles assigned to a selected group](./media/groups-view-assignments/view-assignments.png)
-## Using PowerShell
+## PowerShell
### Get object ID of the group
Get-AzureADMSGroup -SearchString "Contoso_Helpdesk_Administrators"
Get-AzureADMSRoleAssignment -Filter "principalId eq '<object id of group>'"
```
-## Using Microsoft Graph API
+## Microsoft Graph API
### Get object ID of the group
active-directory Manage Roles Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/manage-roles-portal.md
Previously updated : 03/07/2021 Last updated : 05/14/2021
You can now see and manage all the members of the administrator roles in the Azure AD admin center. If you frequently manage role assignments, you will probably prefer this experience. This article describes how to assign Azure AD roles using the Azure AD admin center.
+## Prerequisites
+
+- Privileged Role Administrator or Global Administrator
+- Azure AD Premium P2 license when using Privileged Identity Management (PIM)
+ ## Assign a role
-1. Sign in to the [Azure AD admin center](https://aad.portal.azure.com) with Global Administrator or Privileged Role Administrator permissions.
+1. Sign in to the [Azure AD admin center](https://aad.portal.azure.com).
1. Select **Azure Active Directory**.
You can now see and manage all the members of the administrator roles in the Azu
## Privileged Identity Management (PIM)
-You can select **Manage in PIM** for additional management capabilities using [Azure AD Privileged Identity Management (PIM)](../privileged-identity-management/pim-configure.md). Privileged Role Administrators can change "Permanent" (always active in the role) assignments to "Eligible" (in the role only when elevated). If you don't have Privileged Identity Management, you can still select **Manage in PIM** to sign up for a trial. Privileged Identity Management requires an [Azure AD Premium P2 license plan](../privileged-identity-management/subscription-requirements.md).
+You can select **Manage in PIM** for additional management capabilities using [Azure AD Privileged Identity Management (PIM)](../privileged-identity-management/pim-configure.md). Privileged Role Administrators can change "Permanent" (always active in the role) assignments to "Eligible" (in the role only when elevated). If you don't have Privileged Identity Management, you can still select **Manage in PIM** to sign up for a trial. Privileged Identity Management requires an [Azure AD Premium P2 license](../privileged-identity-management/subscription-requirements.md).
![Screenshot that shows the "User administrator - Assignments" page with the "Manage in PIM" action selected](./media/manage-roles-portal/member-list-pim.png)
active-directory Permissions Reference https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/permissions-reference.md
Windows Defender ATP and EDR | Assign roles<br>Manage machine groups<br>Configur
Users with this role can manage alerts and have global read-only access on security-related features, including all information in Microsoft 365 security center, Azure Active Directory, Identity Protection, Privileged Identity Management and Office 365 Security & Compliance Center. More information about Office 365 permissions is available at [Permissions in the Security & Compliance Center](/office365/securitycompliance/permissions-in-the-security-and-compliance-center).
-In | Can do
- |
-[Microsoft 365 security center](https://protection.office.com) | All permissions of the Security Reader role<br>View, investigate, and respond to security threats alerts
-Azure AD Identity Protection | All permissions of the Security Reader role<br>Additionally, the ability to perform all Identity Protection Center operations except for resetting passwords and configuring alert e-mails.
-[Privileged Identity Management](../privileged-identity-management/pim-configure.md) | All permissions of the Security Reader role
-[Office 365 Security & Compliance Center](https://support.office.com/article/About-Office-365-admin-roles-da585eea-f576-4f55-a1e0-87090b6aaa9d) | All permissions of the Security Reader role<br>View, investigate, and respond to security alerts
-Windows Defender ATP and EDR | All permissions of the Security Reader role<br>View, investigate, and respond to security alerts
-[Intune](/intune/role-based-access-control) | All permissions of the Security Reader role
-[Cloud App Security](/cloud-app-security/manage-admins) | All permissions of the Security Reader role
-[Microsoft 365 service health](/office365/enterprise/view-service-health) | View the health of Microsoft 365 services
+| In | Can do |
+| | |
+| [Microsoft 365 security center](https://protection.office.com) | All permissions of the Security Reader role<br/>View, investigate, and respond to security threats alerts<br/>Manage security settings in security center |
+| [Azure AD Identity Protection](../identity-protection/overview-identity-protection.md) | All permissions of the Security Reader role<br>Additionally, the ability to perform all Identity Protection Center operations except for resetting passwords and configuring alert e-mails. |
+| [Privileged Identity Management](../privileged-identity-management/pim-configure.md) | All permissions of the Security Reader role |
+| [Office 365 Security & Compliance Center](https://support.office.com/article/About-Office-365-admin-roles-da585eea-f576-4f55-a1e0-87090b6aaa9d) | All permissions of the Security Reader role<br>View, investigate, and respond to security alerts |
+| Windows Defender ATP and EDR | All permissions of the Security Reader role<br>View, investigate, and respond to security alerts |
+| [Intune](/intune/role-based-access-control) | All permissions of the Security Reader role |
+| [Cloud App Security](/cloud-app-security/manage-admins) | All permissions of the Security Reader role |
+| [Microsoft 365 service health](/microsoft-365/enterprise/view-service-health) | View the health of Microsoft 365 services |
> [!div class="mx-tableFixed"]
> | Actions | Description |
active-directory Prerequisites https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/prerequisites.md
+
+ Title: Prerequisites to use PowerShell or Graph Explorer for Azure AD roles - Azure Active Directory
+description: Prerequisites to use PowerShell or Graph Explorer for Azure Active Directory roles.
+
+documentationcenter: ''
+Last updated : 05/13/2021
+# Prerequisites to use PowerShell or Graph Explorer for Azure AD roles
+
+If you want to manage Azure Active Directory (Azure AD) roles using PowerShell or Graph Explorer, you must meet certain prerequisites. This article describes the PowerShell and Graph Explorer prerequisites for different Azure AD role features.
+
+## AzureAD module
+
+To use PowerShell commands to do the following:
+
+- List role assignments
+- Create a role-assignable group
+- Manage administrative units
+
+You must have the following module installed:
+
+- [AzureAD](https://www.powershellgallery.com/packages/AzureAD) version 2.0.2.130 or later
+#### Check AzureAD version
+
+To check which version of AzureAD you have installed, use [Get-InstalledModule](/powershell/module/powershellget/get-installedmodule).
+
+```powershell
+Get-InstalledModule -Name AzureAD
+```
+
+You should see output similar to the following:
+
+```powershell
+Version Name Repository Description
+------- ---- ---------- -----------
+2.0.2.130 AzureAD PSGallery Azure Active Directory V2 General Availability M...
+```
+
+#### Install AzureAD
+
+If you don't have AzureAD installed, use [Install-Module](/powershell/module/powershellget/install-module) to install AzureAD.
+
+```powershell
+Install-Module -Name AzureAD
+```
+
+#### Update AzureAD
+
+To update AzureAD to the latest version, re-run [Install-Module](/powershell/module/powershellget/install-module).
+
+```powershell
+Install-Module -Name AzureAD
+```
+
+#### Use AzureAD
+
+To use AzureAD, follow these steps to make sure it is imported into the current session.
+
+1. Use [Get-Module](/powershell/module/microsoft.powershell.core/get-module) to check if AzureAD is loaded into memory.
+
+ ```powershell
+ Get-Module -Name AzureAD
+ ```
+
+1. If you don't see any output in the previous step, use [Import-Module](/powershell/module/microsoft.powershell.core/import-module) to import AzureAD. The `-Force` parameter removes the loaded module and then imports it again.
+
+ ```powershell
+ Import-Module -Name AzureAD -Force
+ ```
+
+1. Run [Get-Module](/powershell/module/microsoft.powershell.core/get-module) again.
+
+ ```powershell
+ Get-Module -Name AzureAD
+ ```
+
+ You should see output similar to the following:
+
+ ```powershell
+ ModuleType Version Name ExportedCommands
+    ---------- ------- ---- ----------------
+ Binary 2.0.2.130 AzureAD {Add-AzureADApplicationOwner, Add-AzureADDeviceRegisteredO...
+ ```
+
+## AzureADPreview module
+
+To use PowerShell commands to do the following:
+
+- Assign roles to users or groups
+- Remove a role assignment
+- Make a group eligible for a role using Privileged Identity Management
+- Create custom roles
+
+You must have the following module installed:
+
+- [AzureADPreview](https://www.powershellgallery.com/packages/AzureADPreview) version 2.0.2.129 or later
+#### Check AzureADPreview version
+
+To check which version of AzureADPreview you have installed, use [Get-InstalledModule](/powershell/module/powershellget/get-installedmodule).
+
+```powershell
+Get-InstalledModule -Name AzureADPreview
+```
+
+You should see output similar to the following:
+
+```powershell
+Version Name Repository Description
+------- -------------- ---------- -----------
+2.0.2.129 AzureADPreview PSGallery Azure Active Directory V2 Preview Module. ...
+```
+
+#### Install AzureADPreview
+
+If you don't have AzureADPreview installed, use [Install-Module](/powershell/module/powershellget/install-module) to install AzureADPreview.
+
+```powershell
+Install-Module -Name AzureADPreview
+```
+
+#### Update AzureADPreview
+
+To update AzureADPreview to the latest version, re-run [Install-Module](/powershell/module/powershellget/install-module).
+
+```powershell
+Install-Module -Name AzureADPreview
+```
+
+#### Use AzureADPreview
+
+To use AzureADPreview, follow these steps to make sure it is imported into the current session.
+
+1. Use [Get-Module](/powershell/module/microsoft.powershell.core/get-module) to check if AzureADPreview is loaded into memory.
+
+ ```powershell
+ Get-Module -Name AzureADPreview
+ ```
+
+1. If you don't see any output in the previous step, use [Import-Module](/powershell/module/microsoft.powershell.core/import-module) to import AzureADPreview. The `-Force` parameter removes the loaded module and then imports it again.
+
+ ```powershell
+ Import-Module -Name AzureADPreview -Force
+ ```
+
+1. Run [Get-Module](/powershell/module/microsoft.powershell.core/get-module) again.
+
+ ```powershell
+ Get-Module -Name AzureADPreview
+ ```
+
+ You should see output similar to the following:
+
+ ```powershell
+ ModuleType Version Name ExportedCommands
+    ---------- ------- -------------- ----------------
+ Binary 2.0.2.129 AzureADPreview {Add-AzureADAdministrativeUnitMember, Add-AzureADApplicati...
+ ```
+
+## Graph Explorer
+
+To manage Azure AD roles using the [Microsoft Graph API](/graph/overview) and [Graph Explorer](/graph/graph-explorer/graph-explorer-overview), you must do the following:
+
+1. In the Azure portal, open **Azure Active Directory**.
+
+1. Click **Enterprise applications**.
+
+1. In the applications list, find and click **Graph explorer**.
+
+1. Click **Permissions**.
+
+1. Click **Grant admin consent for Graph explorer**.
+
+ ![Screenshot showing the "Grant admin consent for Graph explorer" link.](./media/prerequisites/select-graph-explorer.png)
+
+1. Use [Graph Explorer tool](https://aka.ms/ge).
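+
+Once admin consent is granted, you can verify access by running a request in Graph Explorer. For example, listing the directory role definitions (a sketch; this call typically requires the *RoleManagement.Read.Directory* permission):
+
+```http
+GET https://graph.microsoft.com/v1.0/roleManagement/directory/roleDefinitions
+```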
+
+## Next steps
+
+- [Install Azure Active Directory PowerShell for Graph](/powershell/azure/active-directory/install-adv2)
+- [AzureAD module docs](/powershell/module/azuread/)
+- [Graph Explorer](/graph/graph-explorer/graph-explorer-overview)
active-directory Quickstart App Registration Limits https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/quickstart-app-registration-limits.md
Previously updated : 11/05/2020 Last updated : 05/14/2021
# Quickstart: Grant permission to create unlimited app registrations
-In this quick start guide, you will create a custom role with permission to create an unlimited number of app registrations, and then assign that role to a user. The assigned user can then use the Azure AD portal, Azure AD PowerShell, or Microsoft Graph API to create application registrations. Unlike the built-in Application Developer role, this custom role grants the ability to create an unlimited number of application registrations. The Application Developer role grants the ability, but the total number of created objects is limited to 250 to prevent hitting [the directory-wide object quota](../enterprise-users/directory-service-limits-restrictions.md). The least privileged role required to create and assign Azure AD custom roles is the Privileged Role administrator.
+In this quickstart, you will create a custom role with permission to create an unlimited number of app registrations, and then assign that role to a user. The assigned user can then use the Azure portal, Azure AD PowerShell, or Microsoft Graph API to create application registrations. Unlike the built-in Application Developer role, this custom role grants the ability to create an unlimited number of application registrations. The Application Developer role grants the ability, but the total number of created objects is limited to 250 to prevent hitting [the directory-wide object quota](../enterprise-users/directory-service-limits-restrictions.md). The least privileged role required to create and assign Azure AD custom roles is the Privileged Role administrator.
If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
-## Create a custom role using the Azure AD portal
+## Prerequisites
-1. Sign in to the [Azure AD admin center](https://aad.portal.azure.com) with Privileged Role administrator or Global administrator permissions in the Azure AD organization.
+- Azure AD Premium P1 or P2 license
+- Privileged Role Administrator or Global Administrator
+- AzureADPreview module when using PowerShell
+- Admin consent when using Graph explorer for Microsoft Graph API
+
+For more information, see [Prerequisites to use PowerShell or Graph Explorer](prerequisites.md).
+
+## Azure portal
+
+### Create a custom role
+
+1. Sign in to the [Azure AD admin center](https://aad.portal.azure.com).
1. Select **Azure Active Directory**, select **Roles and administrators**, and then select **New custom role**. ![Create or edit roles from the Roles and administrators page](./media/quickstart-app-registration-limits/new-custom-role.png)
If you don't have an Azure subscription, [create a free account](https://azure.m
1. On the **Review + create** tab, review the permissions and select **Create**.
-### Assign the role in the Azure AD portal
+### Assign the role
-1. Sign in to the [Azure AD admin center](https://aad.portal.azure.com) with Privileged role administrator or Global administrator permissions in your Azure AD organization.
+1. Sign in to the [Azure AD admin center](https://aad.portal.azure.com).
1. Select **Azure Active Directory** and then select **Roles and administrators**. 1. Select the Application Registration Creator role and select **Add assignment**. 1. Select the desired user and click **Select** to add the user to the role.
If you don't have an Azure subscription, [create a free account](https://azure.m
Done! In this quickstart, you successfully created a custom role with permission to create an unlimited number of app registrations, and then assigned that role to a user.

> [!TIP]
-> To assign the role to an application using the Azure AD portal, enter the name of the application into the search box of the assignment page. Applications are not shown in the list by default, but are returned in search results.
+> To assign the role to an application using the Azure portal, enter the name of the application into the search box of the assignment page. Applications are not shown in the list by default, but are returned in search results.
### App registration permissions
There are two permissions available for granting the ability to create application registrations:

- microsoft.directory/applications/createAsOwner: Assigning this permission results in the creator being added as the first owner of the created app registration, and the created app registration will count against the creator's 250 created objects quota.
- microsoft.directory/applications/create: Assigning this permission results in the creator not being added as the first owner of the created app registration, and the created app registration will not count against the creator's 250 created objects quota. Use this permission carefully, because there is nothing preventing the assignee from creating app registrations until the directory-level quota is hit. If both permissions are assigned, this permission takes precedence.
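When defining the custom role, the permission you choose becomes one of the role definition's allowed resource actions. A minimal sketch of that step in PowerShell (variable names follow the quickstart's later script):

```powershell
# 'create' does not count against the creator's 250-object quota;
# swap in 'createAsOwner' if the creator should own the app registrations
$allowedResourceAction = @("microsoft.directory/applications/create")
$rolePermissions = @{'allowedResourceActions' = $allowedResourceAction}
```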
-## Create a custom role in Azure AD PowerShell
-
-### Prepare PowerShell
-
-First, install the Azure AD PowerShell module from the [PowerShell Gallery](https://www.powershellgallery.com/packages/AzureADPreview/2.0.0.17). Then import the Azure AD PowerShell preview module, using the following command:
-
-```powershell
-Import-Module -Name AzureADPreview
-```
+## PowerShell
-To verify that the module is ready to use, match the version returned by the following command to the one listed here:
-
-```powershell
-Get-Module -Name AzureADPreview
- ModuleType Version Name ExportedCommands
- - - -
- Binary 2.0.0.115 AzureADPreview {Add-AzureADAdministrati...}
-```
-
-### Create the custom role in Azure AD PowerShell
+### Create a custom role
Create a new role using the following PowerShell script:
$rolePermissions = @{'allowedResourceActions'= $allowedResourceAction}
$customRole = New-AzureAdMSRoleDefinition -RolePermissions $rolePermissions -DisplayName $displayName -Description $description -TemplateId $templateId -IsEnabled $true
```
-### Assign the role in Azure AD PowerShell
+### Assign the role
Assign the role using the following PowerShell script:
$resourceScope = '/'
$roleAssignment = New-AzureADMSRoleAssignment -ResourceScope $resourceScope -RoleDefinitionId $roleDefinition.Id -PrincipalId $user.objectId
```
-## Create a custom role in the Microsoft Graph API
+## Microsoft Graph API
+
+### Create a custom role
HTTP request to create the custom role.
Body
}
```
-### Assign the role in the Microsoft Graph API
+### Assign the role
The role assignment combines a security principal ID (which can be a user or service principal), a role definition (role) ID, and an Azure AD resource scope.
active-directory Role Definitions List https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/role-definitions-list.md
This article describes how to list the Azure AD built-in and custom roles along
1. Select **Roles and administrators** to see the list of all available roles.
- ![list of roles in Azure AD portal](./media/role-definitions-list/view-roles-in-azure-active-directory.png)
+ ![list of roles in Azure portal](./media/role-definitions-list/view-roles-in-azure-active-directory.png)
1. On the right, select the ellipsis and then **Description** to see the complete list of permissions for a role.
active-directory Security Planning https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/security-planning.md
Securing privileged access requires changes to:
Secure your privileged access in a way that is managed and reported in the Microsoft services you care about. If you have on-premises admin accounts, see the guidance for on-premises and hybrid privileged access in Active Directory at [Securing Privileged Access](/windows-server/identity/securing-privileged-access/securing-privileged-access). > [!NOTE]
-> The guidance in this article refers primarily to features of Azure Active Directory that are included in Azure Active Directory Premium plans P1 and P2. Azure Active Directory Premium P2 is included in the EMS E5 suite and Microsoft 365 E5 suite. This guidance assumes your organization already has Azure AD Premium P2 licenses purchased for your users. If you do not have these licenses, some of the guidance might not apply to your organization. Also, throughout this article, the term Global Administrator means the same thing as "company administrator" or "tenant administrator."
+> The guidance in this article refers primarily to features of Azure Active Directory that are included in Azure AD Premium P1 and P2. Azure AD Premium P2 is included in the EMS E5 suite and Microsoft 365 E5 suite. This guidance assumes your organization already has Azure AD Premium P2 licenses purchased for your users. If you do not have these licenses, some of the guidance might not apply to your organization. Also, throughout this article, the term Global Administrator means the same thing as "company administrator" or "tenant administrator."
## Develop a roadmap
active-directory View Assignments https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/view-assignments.md
Previously updated : 11/05/2020 Last updated : 05/14/2021
This article describes how to list roles you have assigned in Azure Active Direc
- Role assignments at the organization-wide scope are added to and can be seen in the list of single application role assignments. - Role assignments at the single application scope aren't added to and can't be seen in the list of organization-wide scoped assignments.
+## Prerequisites
+
+- AzureADPreview module when using PowerShell
+- Admin consent when using Graph explorer for Microsoft Graph API
+
+For more information, see [Prerequisites to use PowerShell or Graph Explorer](prerequisites.md).
+ ## List role assignments in the Azure portal This procedure describes how to list role assignments with organization-wide scope.
-1. Sign in to the [Azure AD admin center](https://aad.portal.azure.com) with Privileged role administrator or Global administrator permissions in the Azure AD organization.
+1. Sign in to the [Azure AD admin center](https://aad.portal.azure.com).
1. Select **Azure Active Directory**, select **Roles and administrators**, and then select a role to open it and view its properties. 1. Select **Assignments** to list the role assignments.
To download all assignments for a specific role, on the **Roles and administrato
![download all assignments for a role](./media/view-assignments/download-role-assignments.png)
-## List role assignments using Azure AD PowerShell
+## List role assignments using PowerShell
This section describes viewing assignments of a role with organization-wide scope. This article uses the [Azure Active Directory PowerShell Version 2](/powershell/module/azuread/#directory_roles) module. To view single-application scope assignments using PowerShell, you can use the cmdlets in [Assign custom roles with PowerShell](custom-assign-powershell.md).
-### Prepare PowerShell
-
-First, you must [download the Azure AD preview PowerShell module](https://www.powershellgallery.com/packages/AzureAD/).
-
-To install the Azure AD PowerShell module, use the following commands:
-
-``` PowerShell
-Install-Module -Name AzureADPreview
-Import-Module -Name AzureADPreview
-```
-
-To verify that the module is ready to use, use the following command:
-
-``` PowerShell
-Get-Module -Name AzureADPreview
- ModuleType Version Name ExportedCommands
- - - -
- Binary 2.0.0.115 AzureADPreview {Add-AzureADAdministrati...}
-```
-
-### List role assignments
Example of listing the role assignments:

``` PowerShell
$role = Get-AzureADDirectoryRole -ObjectId "5b3fe201-fa8b-4144-b6f1-875829ff7543"
Get-AzureADDirectoryRoleMember -ObjectId $role.ObjectId | Get-AzureADUser
```
-## List role assignments using Microsoft Graph API
+## List role assignments using the Microsoft Graph API
This section describes how to list role assignments with organization-wide scope. To list single-application scope role assignments using Graph API, you can use the operations in [Assign custom roles with Graph API](custom-assign-graph.md).
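For example, assuming you already know the role definition ID, an organization-wide listing request can be sketched as follows (the ID is a placeholder):

```http
GET https://graph.microsoft.com/beta/roleManagement/directory/roleAssignments?$filter=roleDefinitionId eq '<role-definition-id>'
```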
HTTP/1.1 200 OK
This section describes how to list role assignments with single-application scope. This feature is currently in public preview.
-1. Sign in to the [Azure AD admin center](https://aad.portal.azure.com) with Privileged role administrator or Global administrator permissions in the Azure AD organization.
+1. Sign in to the [Azure AD admin center](https://aad.portal.azure.com).
1. Select **App registrations**, and then select the app registration to view its properties. You might have to select **All applications** to see the complete list of app registrations in your Azure AD organization. ![Create or edit app registrations from the App registrations page](./media/view-assignments/app-reg-all-apps.png)
active-directory Abintegro Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/abintegro-tutorial.md
Previously updated : 09/03/2019 Last updated : 05/17/2021
In this tutorial, you'll learn how to integrate Abintegro with Azure Active Dire
* Enable your users to be automatically signed-in to Abintegro with their Azure AD accounts. * Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
- ## Prerequisites To get started, you need the following items:
To get started, you need the following items:
In this tutorial, you configure and test Azure AD SSO in a test environment.
-* Abintegro supports **SP** initiated SSO
+* Abintegro supports **SP** initiated SSO.
+
+* Abintegro supports **Just In Time** user provisioning.
-* Abintegro supports **Just In Time** user provisioning
+> [!NOTE]
+> The Identifier of this application is a fixed string value, so only one instance can be configured in one tenant.
-## Adding Abintegro from the gallery
+## Add Abintegro from the gallery
To configure the integration of Abintegro into Azure AD, you need to add Abintegro from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service. 1. Navigate to **Enterprise Applications** and then select **All Applications**. 1. To add new application, select **New application**. 1. In the **Add from the gallery** section, type **Abintegro** in the search box. 1. Select **Abintegro** from results panel and then add the app. Wait a few seconds while the app is added to your tenant. -
-## Configure and test Azure AD single sign-on for Abintegro
+## Configure and test Azure AD SSO for Abintegro
Configure and test Azure AD SSO with Abintegro using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Abintegro.
-To configure and test Azure AD SSO with Abintegro, complete the following building blocks:
+To configure and test Azure AD SSO with Abintegro, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature. 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
To configure and test Azure AD SSO with Abintegro, complete the following buildi
Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **Abintegro** application integration page, find the **Manage** section and select **single sign-on**.
+1. In the Azure portal, on the **Abintegro** application integration page, find the **Manage** section and select **single sign-on**.
1. On the **Select a single sign-on method** page, select **SAML**.
-1. On the **Set up single sign-on with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
-1. On the **Basic SAML Configuration** section, enter the values for the following fields:
+1. On the **Basic SAML Configuration** section, perform the following step:
In the **Sign-on URL** text box, type a URL using the following pattern: `https://www.abintegro.com/Shibboleth.sso/Login?entityID=<Issuer>&target=https://www.abintegro.com/secure/`
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**. 1. In the applications list, select **Abintegro**. 1. In the app's overview page, find the **Manage** section and select **Users and groups**.-
- ![The "Users and groups" link](common/users-groups-blade.png)
- 1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.-
- ![The Add User link](common/add-assign-user.png)
- 1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
-1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, the "Default Access" role is selected.
1. In the **Add Assignment** dialog, click the **Assign** button. ## Configure Abintegro SSO
In this section, a user called Britta Simon is created in Abintegro. Abintegro s
## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
-
-When you click the Abintegro tile in the Access Panel, you should be automatically signed in to the Abintegro for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+In this section, you test your Azure AD single sign-on configuration with the following options.
-## Additional resources
+* Click **Test this application** in the Azure portal. This redirects to the Abintegro Sign-on URL, where you can initiate the login flow.
-- [ List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory ](./tutorial-list.md)
+* Go to the Abintegro Sign-on URL directly and initiate the login flow from there.
-- [What is application access and single sign-on with Azure Active Directory? ](../manage-apps/what-is-single-sign-on.md)
+* You can use Microsoft My Apps. When you click the Abintegro tile in My Apps, you're redirected to the Abintegro Sign-on URL. For more information, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is conditional access in Azure Active Directory?](../conditional-access/overview.md)
+## Next steps
-- [Try Abintegro with Azure AD](https://aad.portal.azure.com/)
+Once you configure Abintegro you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Datacamp Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/datacamp-tutorial.md
Previously updated : 01/10/2020 Last updated : 05/17/2021
In this tutorial, you'll learn how to integrate DataCamp with Azure Active Direc
* Enable your users to be automatically signed-in to DataCamp with their Azure AD accounts. * Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
- ## Prerequisites To get started, you need the following items:
To get started, you need the following items:
In this tutorial, you configure and test Azure AD SSO in a test environment.
-* DataCamp supports **SP and IDP** initiated SSO
-* DataCamp supports **Just In Time** user provisioning
+* DataCamp supports **SP and IDP** initiated SSO.
+* DataCamp supports **Just In Time** user provisioning.
-## Adding DataCamp from the gallery
+## Add DataCamp from the gallery
To configure the integration of DataCamp into Azure AD, you need to add DataCamp from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service. 1. Navigate to **Enterprise Applications** and then select **All Applications**. 1. To add new application, select **New application**. 1. In the **Add from the gallery** section, type **DataCamp** in the search box. 1. Select **DataCamp** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-## Configure and test Azure AD single sign-on for DataCamp
+## Configure and test Azure AD SSO for DataCamp
Configure and test Azure AD SSO with DataCamp using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in DataCamp.
-To configure and test Azure AD SSO with DataCamp, complete the following building blocks:
+To configure and test Azure AD SSO with DataCamp, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
- * **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
- * **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
1. **[Configure DataCamp SSO](#configure-datacamp-sso)** - to configure the single sign-on settings on application side.
- * **[Create DataCamp test user](#create-datacamp-test-user)** - to have a counterpart of B.Simon in DataCamp that is linked to the Azure AD representation of user.
+ 1. **[Create DataCamp test user](#create-datacamp-test-user)** - to have a counterpart of B.Simon in DataCamp that is linked to the Azure AD representation of user.
1. **[Test SSO](#test-sso)** - to verify whether the configuration works. ## Configure Azure AD SSO Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **DataCamp** application integration page, find the **Manage** section and select **single sign-on**.
+1. In the Azure portal, on the **DataCamp** application integration page, find the **Manage** section and select **single sign-on**.
1. On the **Select a single sign-on method** page, select **SAML**.
-1. On the **Set up single sign-on with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
-1. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, enter the values for the following fields:
+1. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, perform the following steps:
a. In the **Identifier** text box, type a URL using the following pattern: `https://www.datacamp.com/groups/<group-identifier>/sso/saml`
Follow these steps to enable Azure AD SSO in the Azure portal.
1. Click **Set additional URLs** and perform the following step if you wish to configure the application in **SP** initiated mode:
- In the **Sign-on URL** text box, type a URL using the following pattern:
+ In the **Sign-on URL** text box, type the URL:
`https://www.datacamp.com/users/sign_in` > [!NOTE]
- > These values are not real. Update these values with the actual Identifier, Reply URL and Sign-on URL. Contact [DataCamp Client support team](https://support.datacamp.com/hc/en-us) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+ > These values are not real. Update these values with the actual Identifier and Reply URL. Contact [DataCamp Client support team](https://support.datacamp.com/hc/en-us) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
1. DataCamp application expects the SAML assertions in a specific format, which requires you to add custom attribute mappings to your SAML token attributes configuration. The following screenshot shows the list of default attributes.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**. 1. In the applications list, select **DataCamp**. 1. In the app's overview page, find the **Manage** section and select **Users and groups**.-
- ![The "Users and groups" link](common/users-groups-blade.png)
- 1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.-
- ![The Add User link](common/add-assign-user.png)
- 1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
-1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, the "Default Access" role is selected.
1. In the **Add Assignment** dialog, click the **Assign** button. ## Configure DataCamp SSO
In this section, a user called B.Simon is created in DataCamp. DataCamp supports
## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+#### SP initiated:
+
+* Click **Test this application** in the Azure portal. This redirects to the DataCamp Sign-on URL, where you can initiate the login flow.
-When you click the DataCamp tile in the Access Panel, you should be automatically signed in to the DataCamp for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+* Go to the DataCamp Sign-on URL directly and initiate the login flow from there.
-## Additional resources
+#### IDP initiated:
-- [ List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory ](./tutorial-list.md)
+* Click **Test this application** in the Azure portal, and you should be automatically signed in to the DataCamp instance for which you set up SSO.
-- [What is application access and single sign-on with Azure Active Directory? ](../manage-apps/what-is-single-sign-on.md)
+You can also use Microsoft My Apps to test the application in any mode. When you click the DataCamp tile in My Apps, if the app is configured in SP mode you're redirected to the application sign-on page to initiate the login flow, and if it's configured in IDP mode you're automatically signed in to the DataCamp instance for which you set up SSO. For more information, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is conditional access in Azure Active Directory?](../conditional-access/overview.md)
+## Next steps
-- [Try DataCamp with Azure AD](https://aad.portal.azure.com/)
+Once you configure DataCamp you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Degreed Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/degreed-tutorial.md
Previously updated : 08/27/2020 Last updated : 05/14/2021 # Tutorial: Azure Active Directory integration with Degreed
-In this tutorial, you learn how to integrate Degreed with Azure Active Directory (Azure AD).
-Integrating Degreed with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate Degreed with Azure Active Directory (Azure AD). When you integrate Degreed with Azure AD, you can:
-* You can control in Azure AD who has access to Degreed.
-* You can enable your users to be automatically signed-in to Degreed (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to Degreed.
+* Enable your users to be automatically signed-in to Degreed with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
-To configure Azure AD integration with Degreed, you need the following items:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/)
-* Degreed single sign-on enabled subscription
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Degreed single sign-on (SSO) enabled subscription.
> [!NOTE] > This integration is also available to use from Azure AD US Government Cloud environment. You can find this application in the Azure AD US Government Cloud Application Gallery and configure it in the same way as you do from public cloud.
To configure Azure AD integration with Degreed, you need the following items:
In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* Degreed supports **SP** initiated SSO
-
-* Degreed supports **Just In Time** user provisioning
+* Degreed supports **SP** initiated SSO.
-* Once you configure Degreed you can enforce Session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad)
+* Degreed supports **Just In Time** user provisioning.
-## Adding Degreed from the gallery
+## Add Degreed from the gallery
To configure the integration of Degreed into Azure AD, you need to add Degreed from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service. 1. Navigate to **Enterprise Applications** and then select **All Applications**. 1. To add new application, select **New application**.
To configure the integration of Degreed into Azure AD, you need to add Degreed f
## Configure and test Azure AD SSO
-In this section, you configure and test Azure AD single sign-on with Degreed based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in Degreed needs to be established.
+Configure and test Azure AD SSO with Degreed using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Degreed.
-To configure and test Azure AD single sign-on with Degreed, you need to complete the following building blocks:
+To configure and test Azure AD SSO with Degreed, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
- * **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
- * **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-2. **[Configure Degreed SSO](#configure-degreed-sso)** - to configure the Single Sign-On settings on application side.
- * **[Create Degreed test user](#create-degreed-test-user)** - to have a counterpart of Britta Simon in Degreed that is linked to the Azure AD representation of user.
-3. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Degreed SSO](#configure-degreed-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create Degreed test user](#create-degreed-test-user)** - to have a counterpart of B.Simon in Degreed that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
## Configure Azure AD SSO Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **Degreed** application integration page, find the **Manage** section and select **single sign-on**.
+1. In the Azure portal, on the **Degreed** application integration page, find the **Manage** section and select **single sign-on**.
1. On the **Select a single sign-on method** page, select **SAML**.
-1. On the **Set up single sign-on with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png) 4. On the **Basic SAML Configuration** section, perform the following steps:
- ![Degreed Domain and URLs single sign-on information](common/sp-identifier.png)
- a. In the **Sign on URL** text box, type a URL using the following pattern: `https://degreed.com/?orgsso=<company code>`
Follow these steps to enable Azure AD SSO in the Azure portal.
![Copy configuration URLs](common/copy-configuration-urls.png)
- a. Login URL
-
- b. Azure Ad Identifier
-
- c. Logout URL
- ### Create an Azure AD test user
-In this section, you'll create a test user named B.Simon in the Azure portal.
+In this section, you'll create a test user in the Azure portal called B.Simon.
-1. In the left pane of the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
-1. At the top of the screen, select **New user**.
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
1. In the **User** properties, follow these steps:
- 1. In the **Name** field, enter **B.Simon**.
- 1. In the **User name** field, enter `<username>@<companydomain>.<extension>`. For example: `B.Simon@contoso.com`.
- 1. Select the **Show password** check box, and then make note of the value that's displayed in the **Password** box.
- 1. Select **Create**.
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
### Assign the Azure AD test user
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**. 1. In the applications list, select **Degreed**. 1. In the app's overview page, find the **Manage** section and select **Users and groups**.-
- ![The "Users and groups" link](common/users-groups-blade.png)
- 1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.-
- ![The Add User link](common/add-assign-user.png)
- 1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
-1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, the "Default Access" role is selected.
1. In the **Add Assignment** dialog, click the **Assign** button. ### Configure Degreed SSO
To configure single sign-on on **Degreed** side, you need to send the downloaded
### Create Degreed test user
-The objective of this section is to create a user called Britta Simon in Degreed. Degreed supports just-in-time provisioning, which is by default enabled.
-
-There is no action item for you in this section. A new user is created during an attempt to access Degreed if it doesn't exist yet.
+In this section, a user called B.Simon is created in Degreed. Degreed supports just-in-time user provisioning, which is enabled by default. There's no action item for you in this section. If a user doesn't already exist in Degreed, a new one is created after authentication.
> [!NOTE] > If you need to create a user manually, you need to contact the [Degreed support team](mailto:sso@degreed.com). - ## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+In this section, you test your Azure AD single sign-on configuration with the following options.
-When you click the Degreed tile in the Access Panel, you should be automatically signed in to the Degreed for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+* Click **Test this application** in the Azure portal. This redirects to the Degreed Sign-on URL, where you can initiate the login flow.
-## Additional Resources
+* Go to the Degreed Sign-on URL directly and initiate the login flow from there.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+* You can use Microsoft My Apps. When you click the Degreed tile in My Apps, you're redirected to the Degreed Sign-on URL. For more information, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure Degreed, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Displayr Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/displayr-tutorial.md
To learn more about SaaS app integration with Azure AD, see [What is application
To get started, you need the following items:

* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
-* Displayr single sign-on (SSO) enabled subscription.
+* Displayr single sign-on (SSO) enabled company.
## Scenario description
-In this tutorial, you configure and test Azure AD SSO in a test environment. Displayr supports **SP** initiated SSO.
+In this tutorial, you will learn how to configure Azure AD SSO in your Displayr company. Displayr supports **SP** initiated SSO.
## Adding Displayr from the gallery
To configure the integration of Displayr into Azure AD, you need to add Displayr
1. In the **Add from the gallery** section, type **Displayr** in the search box.
1. Select **Displayr** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-## Configure and test Azure AD single sign-on
+## Configure Azure AD single sign-on
-Configure and test Azure AD SSO with Displayr using a test user called **Britta Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Displayr.
-
-To configure and test Azure AD SSO with Displayr, complete the following building blocks:
+To configure Azure AD SSO with Displayr, complete the following building blocks:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** to enable your users to use this feature.
2. **[Configure Displayr](#configure-displayr)** to configure the SSO settings on the application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create Displayr test user](#create-displayr-test-user)** to have a counterpart of Britta Simon in Displayr that is linked to the Azure AD representation of user.
+3. **[Restrict access to specific users](#restrict-access-to-specific-users)** to restrict which of your Azure AD users can sign in to Displayr.
4. **[Test SSO](#test-sso)** to verify whether the configuration works.

### Configure Azure AD SSO
Follow these steps to enable Azure AD SSO in the Azure portal.
c. Select **Source Attribute** of **Group ID**.
- d. Check **Customize the name of the group claim**.
-
- e. Check **Emit groups as role claims**.
- f. Click **Save**.

1. On the **Set-up Displayr** section, copy the appropriate URL(s) based on your requirement.
Follow these steps to enable Azure AD SSO in the Azure portal.
3. If you want to set up Displayr manually, open a new web browser window and sign into your Displayr company site as an administrator and perform the following steps:
-4. Click on **Settings** then navigate to **Account**.
+4. Click on the **User** icon, then navigate to **Account settings**.
![Screenshot that shows the "Settings" icon and "Account" selected.](./media/displayr-tutorial/config01.png)
-5. Switch to **Settings** from the top menu and scroll down the page for clicking **Configure Single Sign On (SAML)**.
+5. Switch to **Settings** from the top menu, scroll down the page, and click **Configure Single Sign On (SAML)**.
![Screenshot that shows the "Settings" tab selected and the "Configure Single Sign On (S A M L)" action selected.](./media/displayr-tutorial/config02.png)
Follow these steps to enable Azure AD SSO in the Azure portal.
g. Click **Save**.
-### Create an Azure AD test user
-
-In this section, you'll create a test user in the Azure portal called Britta Simon.
-
-1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
-1. Select **New user** at the top of the screen.
-1. In the **User** properties, follow these steps:
- 1. In the **Name** field, enter `Britta Simon`.
- 1. In the **User name** field, enter the username@companydomain.extension. For example, `BrittaSimon@contoso.com`.
- 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
- 1. Click **Create**.
-
-### Assign the Azure AD test user
-
-In this section, you'll enable Britta Simon to use Azure single sign-on by granting access to Displayr.
-
-1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
-1. In the applications list, select **Displayr**.
-1. In the app's overview page, find the **Manage** section and select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add User link](common/add-assign-user.png)
-
-1. In the **Users and groups** dialog, select **Britta Simon** from the Users list, then click the **Select** button at the bottom of the screen.
-1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
-1. In the **Add Assignment** dialog, click the **Assign** button.
-
-### Create Displayr test user
+### Restrict access to specific users
-To enable Azure AD users, sign in to Displayr, they must be provisioned into Displayr. In Displayr, provisioning is a manual task.
-
-**To provision a user account, perform the following steps:**
-
-1. Sign in to Displayr as an Administrator.
-
-2. Click on **Settings** then navigate to **Account**.
-
- ![Screenshot that shows the "Settings (cog)" icon with "Account" selected.](./media/displayr-tutorial/config01.png)
-
-3. Switch to **Settings** from the top menu and scroll down the page, until **Users** section then click on **New User**.
-
- ![Screenshot that shows the "Settings" tab with "Users" highlighted and the "New User" button selected.](./media/displayr-tutorial/config07.png)
-
-4. On the **New User** page, perform the following steps:
-
- ![Displayr Configuration](./media/displayr-tutorial/config06.png)
-
- a. In **Name** text box, enter the name of user like **Brittasimon**.
-
- b. In **Email** text box, enter the email of user like `Brittasimon@contoso.com`.
-
- c. Select your appropriate **Group membership**.
-
- d. Click **Save**.
+By default, all users in the tenant where you added the Displayr application can log in to Displayr by using SSO. If you want to restrict access to specific users or groups, see [Restrict your Azure AD app to a set of users in an Azure AD tenant](../develop/howto-restrict-your-app-to-a-set-of-users.md).
### Test SSO
-When you select the Displayr tile in the Access Panel, you should be automatically signed in to the Displayr for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+When you select the Displayr tile in the Access Panel, you should be automatically signed in to the Displayr company for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
## Additional Resources
active-directory Dome9arc Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/dome9arc-tutorial.md
Previously updated : 12/16/2020 Last updated : 05/14/2021
In this tutorial, you'll learn how to integrate Check Point CloudGuard Dome9 Arc with Azure Active Directory (Azure AD). When you integrate Check Point CloudGuard Dome9 Arc with Azure AD, you can:

-- Control in Azure AD who has access to Check Point CloudGuard Dome9 Arc.
-- Enable your users to be automatically signed-in to Check Point CloudGuard Dome9 Arc with their Azure AD accounts.
-- Manage your accounts in one central location - the Azure portal.
+* Control in Azure AD who has access to Check Point CloudGuard Dome9 Arc.
+* Enable your users to be automatically signed-in to Check Point CloudGuard Dome9 Arc with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites

To get started, you need the following items:

-- An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
-- Check Point CloudGuard Dome9 Arc single sign-on (SSO) enabled subscription.
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Check Point CloudGuard Dome9 Arc single sign-on (SSO) enabled subscription.
## Scenario description

In this tutorial, you configure and test Azure AD SSO in a test environment.

-- Check Point CloudGuard Dome9 Arc supports **SP and IDP** initiated SSO
+* Check Point CloudGuard Dome9 Arc supports **SP and IDP** initiated SSO.
> [!NOTE]
> The Identifier of this application is a fixed string value, so only one instance can be configured in one tenant.
-## Adding Check Point CloudGuard Dome9 Arc from the gallery
+## Add Check Point CloudGuard Dome9 Arc from the gallery
To configure the integration of Check Point CloudGuard Dome9 Arc into Azure AD, you need to add Check Point CloudGuard Dome9 Arc from the gallery to your list of managed SaaS apps.
Follow these steps to enable Azure AD SSO in the Azure portal.
![Edit Basic SAML Configuration](common/edit-urls.png)
-1. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, enter the values for the following fields:
+1. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, perform the following step:
In the **Reply URL** text box, type a URL using the following pattern: `https://secure.dome9.com/sso/saml/<yourcompanyname>`
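As a quick sanity check, the Reply URL pattern above can be assembled programmatically. This is only an illustration; the function name and the `contoso` account name are hypothetical:

```python
def dome9_reply_url(company_name: str) -> str:
    # company_name stands in for the <yourcompanyname> placeholder; per this
    # tutorial, it should match the Account ID entered later in the Dome9
    # SSO Configuration section.
    return f"https://secure.dome9.com/sso/saml/{company_name}"

print(dome9_reply_url("contoso"))  # https://secure.dome9.com/sso/saml/contoso
```

The same company name is used for both the Reply URL and the Sign on URL, as noted in the Account ID step later in this tutorial.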
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
4. Click **Profile Settings** in the top-right corner and then click **Account Settings**.
- ![Screenshot that shows the "Profile Settings" menu with "Account Settings" selected.](./media/dome9arc-tutorial/configure1.png)
+ ![Screenshot that shows the "Profile Settings" menu with "Account Settings" selected.](./media/dome9arc-tutorial/account.png)
5. Navigate to **SSO** and then click **ENABLE**.
- ![Screenshot that shows the "S S O" tab and "Enable" selected.](./media/dome9arc-tutorial/configure2.png)
+ ![Screenshot that shows the "S S O" tab and "Enable" selected.](./media/dome9arc-tutorial/settings.png)
6. In the SSO Configuration section, perform the following steps:
- ![Check Point CloudGuard Dome9 Arc Configuration](./media/dome9arc-tutorial/configure3.png)
+ ![Check Point CloudGuard Dome9 Arc Configuration](./media/dome9arc-tutorial/configuration.png)
a. Enter the company name in the **Account ID** textbox. This value is used in the **Reply** and **Sign on** URLs mentioned in the **Basic SAML Configuration** section of the Azure portal.
To enable Azure AD users to sign in to Check Point CloudGuard Dome9 Arc, they mu
2. Click **Users & Roles** and then click **Users**.
- ![Screenshot that shows "Users & Roles" with the "Users" action selected.](./media/dome9arc-tutorial/user1.png)
+ ![Screenshot that shows "Users & Roles" with the "Users" action selected.](./media/dome9arc-tutorial/user.png)
3. Click **ADD USER**.
- ![Screenshot that shows "Users & Roles" with the "ADD USER" button selected.](./media/dome9arc-tutorial/user2.png)
+ ![Screenshot that shows "Users & Roles" with the "ADD USER" button selected.](./media/dome9arc-tutorial/add-user.png)
4. In the **Create User** section, perform the following steps:
- ![Add Employee](./media/dome9arc-tutorial/user3.png)
+ ![Add Employee](./media/dome9arc-tutorial/create-user.png)
a. In the **Email** textbox, type the email of the user, like B.Simon@contoso.com.
To enable Azure AD users to sign in to Check Point CloudGuard Dome9 Arc, they mu
## Test SSO
-In this section, you test your Azure AD single sign-on configuration with following options.
+In this section, you test your Azure AD single sign-on configuration with the following options.
#### SP initiated:

-- Click on **Test this application** in Azure portal. This will redirect to Check Point CloudGuard Dome9 Arc Sign on URL where you can initiate the login flow.
+* Click on **Test this application** in the Azure portal. This will redirect to the Check Point CloudGuard Dome9 Arc Sign-on URL where you can initiate the login flow.
-- Go to Check Point CloudGuard Dome9 Arc Sign-on URL directly and initiate the login flow from there.
+* Go to the Check Point CloudGuard Dome9 Arc Sign-on URL directly and initiate the login flow from there.

#### IDP initiated:

-- Click on **Test this application** in Azure portal and you should be automatically signed in to the Check Point CloudGuard Dome9 Arc for which you set up the SSO
+* Click on **Test this application** in the Azure portal and you should be automatically signed in to the Check Point CloudGuard Dome9 Arc for which you set up the SSO.
You can also use Microsoft My Apps to test the application in any mode. When you click the Check Point CloudGuard Dome9 Arc tile in My Apps, if configured in SP mode you will be redirected to the application sign-on page to initiate the login flow, and if configured in IDP mode, you should be automatically signed in to the Check Point CloudGuard Dome9 Arc for which you set up the SSO. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).

## Next steps
-Once you configure Check Point CloudGuard Dome9 Arc you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-any-app).
+Once you configure Check Point CloudGuard Dome9 Arc, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Edx For Business Saml Integration Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/edx-for-business-saml-integration-tutorial.md
Previously updated : 08/20/2020 Last updated : 05/13/2021
In this tutorial, you'll learn how to integrate edX for Business SAML Integratio
* Enable your users to be automatically signed-in to edX for Business SAML Integration with their Azure AD accounts.
* Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-
## Prerequisites

To get started, you need the following items:
To get started, you need the following items:
## Scenario description

In this tutorial, you configure and test Azure AD SSO in a test environment.
-* edX for Business SAML Integration supports **SP** initiated SSO
-* edX for Business SAML Integration supports **Just In Time** user provisioning
-
-* Once you configure edX for Business SAML Integration you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-any-app).
+* edX for Business SAML Integration supports **SP** initiated SSO.
+* edX for Business SAML Integration supports **Just In Time** user provisioning.
> [!NOTE]
> The Identifier of this application is a fixed string value, so only one instance can be configured in one tenant.
-## Adding edX for Business SAML Integration from the gallery
+## Add edX for Business SAML Integration from the gallery
To configure the integration of edX for Business SAML Integration into Azure AD, you need to add edX for Business SAML Integration from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service.
1. Navigate to **Enterprise Applications** and then select **All Applications**.
1. To add a new application, select **New application**.
1. In the **Add from the gallery** section, type **edX for Business SAML Integration** in the search box.
1. Select **edX for Business SAML Integration** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.

## Configure and test Azure AD SSO for edX for Business SAML Integration

Configure and test Azure AD SSO with edX for Business SAML Integration using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in edX for Business SAML Integration.
-To configure and test Azure AD SSO with edX for Business SAML Integration, complete the following building blocks:
+To configure and test Azure AD SSO with edX for Business SAML Integration, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
To configure and test Azure AD SSO with edX for Business SAML Integration, compl
Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **edX for Business SAML Integration** application integration page, find the **Manage** section and select **single sign-on**.
+1. In the Azure portal, on the **edX for Business SAML Integration** application integration page, find the **Manage** section and select **single sign-on**.
1. On the **Select a single sign-on method** page, select **SAML**.
-1. On the **Set up single sign-on with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
-1. On the **Basic SAML Configuration** section, enter the values for the following fields:
+1. On the **Basic SAML Configuration** section, perform the following step:
In the **Sign-on URL** text box, type a URL using the following pattern: `https://courses.edx.org/dashboard?tpa_hint=<INSTANCE_NAME>`
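The SP-initiated Sign-on URL pattern above can likewise be built from the instance name. This sketch is illustrative only; `saml-contoso` is a hypothetical value standing in for the `<INSTANCE_NAME>` placeholder:

```python
from urllib.parse import urlencode

def edx_sign_on_url(instance_name: str) -> str:
    # instance_name replaces the <INSTANCE_NAME> placeholder in the
    # Sign-on URL pattern; tpa_hint selects the SAML identity provider.
    return "https://courses.edx.org/dashboard?" + urlencode({"tpa_hint": instance_name})

print(edx_sign_on_url("saml-contoso"))  # https://courses.edx.org/dashboard?tpa_hint=saml-contoso
```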
Follow these steps to enable Azure AD SSO in the Azure portal.
1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, click the copy button to copy the **App Federation Metadata Url** and save it on your computer.

   ![The Certificate download link](common/copy-metadataurl.png)

### Create an Azure AD test user

In this section, you'll create a test user in the Azure portal called B.Simon.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
1. In the applications list, select **edX for Business SAML Integration**.
1. In the app's overview page, find the **Manage** section and select **Users and groups**.
- ![The "Users and groups" link](common/users-groups-blade.png)
1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
- ![The Add User link](common/add-assign-user.png)
1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
-1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see the "Default Access" role selected.
1. In the **Add Assignment** dialog, click the **Assign** button.

## Configure edX for Business SAML Integration SSO
In this section, a user called Britta Simon is created in edX for Business SAML
## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
-
-When you click the edX for Business SAML Integration tile in the Access Panel, you should be automatically signed in to the edX for Business SAML Integration for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
-
-## Additional resources
--- [ List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory ](./tutorial-list.md)
+In this section, you test your Azure AD single sign-on configuration with the following options.
-- [What is application access and single sign-on with Azure Active Directory? ](../manage-apps/what-is-single-sign-on.md)
+* Click on **Test this application** in the Azure portal. This will redirect to the edX for Business SAML Integration Sign-on URL where you can initiate the login flow.
-- [What is conditional access in Azure Active Directory?](../conditional-access/overview.md)
+* Go to the edX for Business SAML Integration Sign-on URL directly and initiate the login flow from there.
-- [Try edX for Business SAML Integration with Azure AD](https://aad.portal.azure.com/)
+* You can use Microsoft My Apps. When you click the edX for Business SAML Integration tile in My Apps, you will be redirected to the edX for Business SAML Integration Sign-on URL. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is session control in Microsoft Cloud App Security?](/cloud-app-security/proxy-intro-aad)
+## Next steps
-- [How to protect edX for Business SAML Integration with advanced visibility and controls](/cloud-app-security/proxy-intro-aad)
+Once you configure edX for Business SAML Integration, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Evidence Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/evidence-tutorial.md
Previously updated : 04/24/2020 Last updated : 05/17/2021
In this tutorial, you'll learn how to integrate Evidence.com with Azure Active D
* Enable your users to be automatically signed-in to Evidence.com with their Azure AD accounts.
* Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-
## Prerequisites

To get started, you need the following items:
To get started, you need the following items:
In this tutorial, you configure and test Azure AD SSO in a test environment.
-* Evidence.com supports **SP** initiated SSO
-* Once you configure Evidence.com you can enforce session control, which protect exfiltration and infiltration of your organization's sensitive data in real-time. Session control extend from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-any-app).
+* Evidence.com supports **SP** initiated SSO.
-## Adding Evidence.com from the gallery
+## Add Evidence.com from the gallery
To configure the integration of Evidence.com into Azure AD, you need to add Evidence.com from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service.
1. Navigate to **Enterprise Applications** and then select **All Applications**.
1. To add a new application, select **New application**.
1. In the **Add from the gallery** section, type **Evidence.com** in the search box.
1. Select **Evidence.com** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-## Configure and test Azure AD single sign-on for Evidence.com
+## Configure and test Azure AD SSO for Evidence.com
Configure and test Azure AD SSO with Evidence.com using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Evidence.com.
-To configure and test Azure AD SSO with Evidence.com, complete the following building blocks:
+To configure and test Azure AD SSO with Evidence.com, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
To configure and test Azure AD SSO with Evidence.com, complete the following bui
Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **Evidence.com** application integration page, find the **Manage** section and select **single sign-on**.
+1. In the Azure portal, on the **Evidence.com** application integration page, find the **Manage** section and select **single sign-on**.
1. On the **Select a single sign-on method** page, select **SAML**.
-1. On the **Set up single sign-on with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
1. In the applications list, select **Evidence.com**.
1. In the app's overview page, find the **Manage** section and select **Users and groups**.
- ![The "Users and groups" link](common/users-groups-blade.png)
1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
- ![The Add User link](common/add-assign-user.png)
1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
-1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see the "Default Access" role selected.
1. In the **Add Assignment** dialog, click the **Assign** button.

## Configure Evidence.com SSO
-1. In a separate web browser window, sign into your Evidence.com tenant as an administrator and navigate to **Admin** Tab
+1. In a separate web browser window, sign into your Evidence.com tenant as an administrator and navigate to **Admin** Tab.
-2. Click on **Agency Single Sign On**
+2. Click on **Agency Single Sign On**.
-3. Select **SAML Based Single Sign On**
+3. Select **SAML Based Single Sign On**.
4. Copy the **Azure AD Identifier**, **Login URL**, and **Logout URL** values shown in the Azure portal into the corresponding fields in Evidence.com.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
### Create Evidence.com test user
-For Azure AD users to be able to sign in, they must be provisioned for access inside the Evidence.com application. This section describes how to create Azure AD user accounts inside Evidence.com
+For Azure AD users to be able to sign in, they must be provisioned for access inside the Evidence.com application. This section describes how to create Azure AD user accounts inside Evidence.com.
**To provision a user account in Evidence.com:**
For Azure AD users to be able to sign in, they must be provisioned for access in
## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
-
-When you click the Evidence.com tile in the Access Panel, you should be automatically signed in to the Evidence.com for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
-
-## Additional resources
--- [ List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory ](./tutorial-list.md)
+In this section, you test your Azure AD single sign-on configuration with the following options.
-- [What is application access and single sign-on with Azure Active Directory? ](../manage-apps/what-is-single-sign-on.md)
+* Click on **Test this application** in the Azure portal. This will redirect to the Evidence.com Sign-on URL where you can initiate the login flow.
-- [What is conditional access in Azure Active Directory?](../conditional-access/overview.md)
+* Go to the Evidence.com Sign-on URL directly and initiate the login flow from there.
-- [Try Evidence.com with Azure AD](https://aad.portal.azure.com/)
+* You can use Microsoft My Apps. When you click the Evidence.com tile in My Apps, you will be redirected to the Evidence.com Sign-on URL. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is session control in Microsoft Cloud App Security?](/cloud-app-security/proxy-intro-aad)
+## Next steps
-- [How to protect Evidence.com with advanced visibility and controls](/cloud-app-security/proxy-intro-aad)
+Once you configure Evidence.com, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Jostle Provisioning Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/jostle-provisioning-tutorial.md
The Azure AD provisioning service allows you to scope who will be provisioned ba
This section guides you through the steps to configure the Azure AD provisioning service to create, update, and disable users and groups in the Jostle app based on user and group assignments in Azure AD.
+> [!NOTE]
+> For more information on automatic user provisioning to Jostle, see [User-Provisioning-Azure-Integration](https://forum.jostle.us/hc/en-us/articles/360056368534-User-Provisioning-Azure-Integration).
+ ### To configure automatic user provisioning for Jostle in Azure AD:

1. Sign in to the [Azure portal](https://portal.azure.com). Select **Enterprise Applications**, then select **All applications**.
This section guides you through the steps to configure the Azure AD provisioning
![The Jostle link in the Applications list](common/all-applications.png)
-1. Select the **Provisioning** tab.
+1. Select the **Provisioning** tab and click **Get Started**.
![Provisioning tab](common/provisioning.png)
This section guides you through the steps to configure the Azure AD provisioning
![Token](common/provisioning-testconnection-tenanturltoken.png)
-1. In the **Notification Email** field, enter the email address of a person or group who should receive the provisioning error notifications. Select the **Send an email notification when a failure occurs** check box.
+1. In the **Notification Email** field, enter the email address of a person or group who should receive the provisioning error notifications. Select the **Send an email notification when a failure occurs** check box. Note that Jostle also sends its own provisioning failure notifications, so this step is optional.
![Notification Email](common/provisioning-notification-email.png)

1. Select **Save**.
-1. In the **Mappings** section, select **Synchronize Azure Active Directory Users to Jostle**.
+1. In the **Mappings** section, select **Provision Azure Active Directory Users to Jostle**.
1. Review the user attributes that are synchronized from Azure AD to Jostle in the **Attribute Mapping** section. The attributes selected as **Matching** properties are used to match the user accounts in Jostle for update operations. If you change the [matching target attribute](../app-provisioning/customize-application-attributes.md), you'll need to ensure that the Jostle API supports filtering users based on that attribute. Select **Save** to commit any changes.
This section guides you through the steps to configure the Azure AD provisioning
![Provisioning Status Toggled On](common/provisioning-toggle-on.png)
-1. Define the users or groups that you want to provision to Jostle by selecting the desired values in **Scope** in the **Settings** section.
+1. Define the users or groups that you want to provision to Jostle by selecting the desired values in **Scope** in the **Settings** section. For Jostle, the **Scope** should be set to "Sync only assigned users and groups".
![Provisioning Scope](common/provisioning-scope.png)
active-directory Mitel Connect Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/mitel-connect-tutorial.md
Previously updated : 07/31/2020 Last updated : 05/14/2021

# Tutorial: Azure Active Directory integration with Mitel MiCloud Connect or CloudLink Platform
In this tutorial, you will learn how to use the Mitel Connect app to integrate A
* You can control users' access to MiCloud Connect apps and to CloudLink apps in Azure AD by using their enterprise credentials.
* You can enable users on your account to be automatically signed in to MiCloud Connect or CloudLink (single sign-on) by using their Azure AD accounts.
-For details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin the integration of Azure AD with Mitel MiCloud Connect or CloudLink Platform.
-
## Prerequisites

To configure Azure AD integration with MiCloud Connect, you need the following items:
To configure Azure AD integration with MiCloud Connect, you need the following i
In this tutorial, you'll configure and test Azure AD single sign-on (SSO).
-* Mitel Connect supports **SP** initiated SSO
-* Once you configure Mitel Connect you can enforce session control, which protects exfiltration and infiltration of your organizationΓÇÖs sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-any-app).
-
-## Add Mitel Connect from the gallery
-
-To configure the integration of Mitel Connect into Azure AD, you need to add Mitel Connect from the gallery to your list of managed SaaS apps in the Azure portal.
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, select **Azure Active Directory**.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Select **Enterprise Applications**, and then select **All Applications**.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
+* Mitel Connect supports **SP** initiated SSO.
-3. Select **New application**.
+## Adding Mitel Connect from the gallery
- ![The New application button](common/add-new-app.png)
+To configure the integration of Mitel Connect into Azure AD, you need to add Mitel Connect from the gallery to your list of managed SaaS apps.
-4. Type **Mitel Connect** in the search field, select **Mitel Connect** from results panel, and then select **Add**.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **Mitel Connect** in the search box.
+1. Select **Mitel Connect** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
- ![Mitel Connect in the results list](common/search-new-app.png)
-
-## Configure and test Azure AD single sign-on
+## Configure and test Azure AD SSO
In this section, you'll configure and test Azure AD SSO with MiCloud Connect or CloudLink Platform based on a test user named **_Britta Simon_**. For single sign-on to work, a link must be established between the user in Azure AD portal and the corresponding user on the Mitel platform. Refer to the following sections for information about configuring and testing Azure AD SSO with MiCloud Connect or CloudLink Platform.

* Configure and test Azure AD SSO with MiCloud Connect
To configure and test Azure AD single sign-on with MiCloud Connect:
2. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
3. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
4. **[Create a Mitel MiCloud Connect test user](#create-a-mitel-micloud-connect-test-user)** - to have a counterpart of Britta Simon on your MiCloud Connect account that is linked to the Azure AD representation of the user.
-5. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+5. **[Test SSO](#test-sso)** - to verify whether the configuration works.
## Configure MiCloud Connect for SSO with Azure AD
In this section, you'll enable Azure AD single sign-on for MiCloud Connect in th
To configure MiCloud Connect with SSO for Azure AD, it is easiest to open the Azure portal and the Mitel Account portal side by side. You'll need to copy some information from the Azure portal to the Mitel Account portal and some from the Mitel Account portal to the Azure portal.
-1. To open the configuration page in the [Azure portal](https://portal.azure.com/):
+1. To open the configuration page in the Azure portal:
1. On the **Mitel Connect** application integration page, select **Single sign-on**.
- ![Configure single sign-on link](common/select-sso.png)
-
- 1. In the **Select a Single sign-on method** dialog box, select **SAML**.
-
- ![Single sign-on select mode](common/select-saml-option.png)
-
- The SAML-based sign-on page is displayed.
+ 1. In the **Select a Single sign-on method** dialog box, select **SAML**. The SAML-based sign-on page is displayed.
2. To open the configuration dialog box in the Mitel Account portal:
To configure MiCloud Connect with SSO for Azure AD, it is easiest to open the Az
### Create an Azure AD test user
-In this section, you'll create a test user named Britta Simon in the Azure portal.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
+In this section, you'll create a test user in the Azure portal called B.Simon.
- ![New user Button](common/new-user.png)
-
-3. In the User properties dialog box, do the following steps:
-
- ![The User dialog box](common/user-properties.png)
-
- 1. In the **Name** field, type **BrittaSimon**.
-
- 1. In the **User name** field, type brittasimon@\<yourcompanydomain\>.\<extension\>. For example, BrittaSimon@contoso.com.
-
- 1. Select the **Show password** check box, and then write down the value that is displayed in the **Password** box.
-
- 1. Select **Create**.
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
### Assign the Azure AD test user
-In this section, you'll enable Britta Simon to use Azure single sign-on by granting access to Mitel Connect.
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Mitel Connect.
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
-
- ![Enterprise applications blade](common/enterprise-applications.png)
-
-2. In the applications list, select **Mitel Connect**.
-
- ![The Mitel Connect link in the Applications list](common/all-applications.png)
-
-3. In the menu on the left, select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-4. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog box.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog box, select **Britta Simon** in the **Users** list, then choose **Select** at the bottom of the screen.
-
-6. If you are expecting any role value in the SAML assertion, select the appropriate role for the user from the list in the **Select Role** dialog box, and then choose **Select** at the bottom of the screen.
-
-7. In the **Add Assignment** dialog box, select **Assign**.
+1. In the applications list, select **Mitel Connect**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
### Create a Mitel MiCloud Connect test user
Create a user on your MiCloud Connect account with the following details:
> [!NOTE]
> The user's MiCloud Connect username must be identical to the user's email address in Azure.
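The note above implies a simple invariant you can sanity-check when linking accounts. This is an illustrative helper, not part of the Mitel or Azure tooling, and the case-insensitive comparison is an assumption; confirm your platform's exact matching rules.

```python
def usernames_match(micloud_username: str, azure_email: str) -> bool:
    """Check the linking invariant: MiCloud username equals the Azure email.

    Case-insensitive comparison is an assumption for illustration only.
    """
    return micloud_username.strip().lower() == azure_email.strip().lower()

# Hypothetical values:
print(usernames_match("Britta.Simon@contoso.com", "britta.simon@contoso.com"))
# True
```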
-### Test single sign-on
+### Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+* Click **Test this application** in the Azure portal. This redirects to the Mitel Connect Sign-on URL, where you can initiate the login flow.
-In this section, you'll test your Azure AD single sign-on configuration using the Access Panel.
+* Go to the Mitel Connect Sign-on URL directly and initiate the login flow from there.
+
+* You can use Microsoft My Apps. When you click the Mitel Connect tile in My Apps, you're redirected to the MiCloud Connect Sign-on URL. For more information, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
-When you select the Mitel Connect tile in the Access Panel, you should be automatically redirected to sign in to the MiCloud Connect application you configured as your default in the **Sign-on URL** field. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
## Configure and test Azure AD SSO with CloudLink Platform
This section describes how to enable Azure AD SSO for CloudLink platform in the
To configure CloudLink platform with single sign-on for Azure AD, it is recommended that you open the Azure portal and the CloudLink Accounts portal side by side as you will need to copy some information from the Azure portal to the CloudLink Accounts portal and vice versa.
-1. To open the configuration page in the [Azure portal](https://portal.azure.com/):
+1. To open the configuration page in the Azure portal:
1. On the **Mitel Connect** application integration page, select **Single sign-on**.
-
- ![Configure single sign-on link](common/select-sso.png)
-
- 1. In the **Select a Single sign-on method** dialog box, select **SAML**.
-
- ![Single sign-on select mode](common/select-saml-option.png)
-
- The **SAML-based Sign-on** page opens, displaying the **Basic SAML Configuration** section.
+ 1. In the **Select a Single sign-on method** dialog box, select **SAML**. The **SAML-based Sign-on** page opens, displaying the **Basic SAML Configuration** section.
![Screenshot shows the SAML-based Sign-on page with Basic SAML Configuration.](./media/mitel-connect-tutorial/mitel-azure-saml-settings.png)
To configure CloudLink platform with single sign-on for Azure AD, it is recommen
### Create an Azure AD test user
-In this section, you'll create a test user named Britta Simon in the Azure portal.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
-
-3. In the User properties dialog box, do the following steps:
+In this section, you'll create a test user in the Azure portal called B.Simon.
- ![The User dialog box](common/user-properties.png)
-
- 1. In the **Name** field, type **BrittaSimon**.
-
- 1. In the **User name** field, type brittasimon@\<yourcompanydomain\>.\<extension\>. For example, BrittaSimon@contoso.com.
-
- 1. Select the **Show password** check box, and then write down the value that is displayed in the **Password** box.
-
- 1. Select **Create**.
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
### Assign the Azure AD test user
-In this section, you'll enable Britta Simon to use Azure single sign-on by granting access to Mitel Connect.
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Mitel Connect.
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
-
- ![Enterprise applications blade](common/enterprise-applications.png)
-
-2. In the applications list, select **Mitel Connect**.
-
- ![The Mitel Connect link in the Applications list](common/all-applications.png)
-
-3. In the menu on the left, select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-4. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog box.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog box, select **Britta Simon** in the **Users** list, then choose **Select** at the bottom of the screen.
-
-6. If you are expecting any role value in the SAML assertion, select the appropriate role for the user from the list in the **Select Role** dialog box, and then choose **Select** at the bottom of the screen.
-
-7. In the **Add Assignment** dialog box, select **Assign**.
+1. In the applications list, select **Mitel Connect**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
### Create a CloudLink test user
Create a user on your CloudLink Accounts portal with the following details:
> [!NOTE]
> The user's CloudLink email address must be identical to the **User Principal Name** in the Azure portal.
-### Test single sign-on
+### Test SSO
-In this section, you'll test your Azure AD SSO configuration using the Access Panel.
+In this section, you test your Azure AD single sign-on configuration with the following options.
-When you select the Mitel Connect tile in the Access Panel, you will be automatically redirected to sign in to the CloudLink application you configured as your default in the **Sign-on URL** field. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+* Click **Test this application** in the Azure portal. This redirects to the CloudLink Sign-on URL, where you can initiate the login flow.
-## Additional resources
+* Go to the CloudLink Sign-on URL directly and initiate the login flow from there.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+* You can use Microsoft My Apps. When you click the Mitel Connect tile in My Apps, you're redirected to the CloudLink Sign-on URL. For more information, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure Mitel Connect, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Officespace Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/officespace-tutorial.md
Previously updated : 10/23/2019 Last updated : 05/14/2021
In this tutorial, you'll learn how to integrate OfficeSpace Software with Azure
* Enable your users to be automatically signed-in to OfficeSpace Software with their Azure AD accounts. * Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-
## Prerequisites

To get started, you need the following items:
To get started, you need the following items:
In this tutorial, you configure and test Azure AD SSO in a test environment.
-* OfficeSpace Software supports **SP** initiated SSO
--
-* OfficeSpace Software supports **Just In Time** user provisioning
+* OfficeSpace Software supports **SP** initiated SSO.
+* OfficeSpace Software supports **Just In Time** user provisioning.
-
-## Adding OfficeSpace Software from the gallery
+## Add OfficeSpace Software from the gallery
To configure the integration of OfficeSpace Software into Azure AD, you need to add OfficeSpace Software from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service.
1. Navigate to **Enterprise Applications** and then select **All Applications**.
1. To add a new application, select **New application**.
1. In the **Add from the gallery** section, type **OfficeSpace Software** in the search box.
1. Select **OfficeSpace Software** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-
-## Configure and test Azure AD single sign-on for OfficeSpace Software
+## Configure and test Azure AD SSO for OfficeSpace Software
Configure and test Azure AD SSO with OfficeSpace Software using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in OfficeSpace Software.
-To configure and test Azure AD SSO with OfficeSpace Software, complete the following building blocks:
+To configure and test Azure AD SSO with OfficeSpace Software, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature. 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
To configure and test Azure AD SSO with OfficeSpace Software, complete the follo
Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **OfficeSpace Software** application integration page, find the **Manage** section and select **single sign-on**.
+1. In the Azure portal, on the **OfficeSpace Software** application integration page, find the **Manage** section and select **single sign-on**.
1. On the **Select a single sign-on method** page, select **SAML**.
-1. On the **Set up single sign-on with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
-1. On the **Basic SAML Configuration** section, enter the values for the following fields:
+1. On the **Basic SAML Configuration** section, perform the following steps:
a. In the **Sign on URL** text box, type a URL using the following pattern: `https://<company name>.officespacesoftware.com/users/sign_in/saml`
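As an illustrative sketch (not part of the official tutorial), the Sign on URL pattern above can be expressed as a small helper. The `contoso` subdomain below is a hypothetical placeholder, not a real tenant.

```python
def officespace_sign_on_url(company_name: str) -> str:
    """Build the SP-initiated Sign on URL from the OfficeSpace subdomain."""
    return f"https://{company_name}.officespacesoftware.com/users/sign_in/saml"

# Hypothetical example:
print(officespace_sign_on_url("contoso"))
# https://contoso.officespacesoftware.com/users/sign_in/saml
```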
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
1. In the applications list, select **OfficeSpace Software**.
1. In the app's overview page, find the **Manage** section and select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add User link](common/add-assign-user.png)
- 1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
-1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
1. In the **Add Assignment** dialog, click the **Assign** button.
-### Configure OfficeSpace Software SSO
+## Configure OfficeSpace Software SSO
1. In a different web browser window, sign in to your OfficeSpace Software tenant as an administrator. 2. Go to **Settings** and click **Connectors**.
- ![Screenshot that shows the "Settings" drop-down with "Connectors" selected.](./media/officespace-tutorial/tutorial_officespace_002.png)
+ ![Screenshot that shows the "Settings" drop-down with "Connectors" selected.](./media/officespace-tutorial/settings.png)
3. Click **SAML Authentication**.
- ![Screenshot that shows the "Authentication" section with the "S A M L Authentication" action selected.](./media/officespace-tutorial/tutorial_officespace_003.png)
+ ![Screenshot that shows the "Authentication" section with the "S A M L Authentication" action selected.](./media/officespace-tutorial/authentication.png)
4. In the **SAML Authentication** section, perform the following steps:
- ![Configure Single Sign-On On App Side](./media/officespace-tutorial/tutorial_officespace_004.png)
+ ![Configure Single Sign-On On App Side](./media/officespace-tutorial/configuration.png)
a. In the **Logout provider url** textbox, paste the value of **Logout URL** which you have copied from Azure portal.
In this section, a user called B.Simon is created in OfficeSpace Software. Offic
## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
-
-When you click the OfficeSpace Software tile in the Access Panel, you should be automatically signed in to the OfficeSpace Software for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+In this section, you test your Azure AD single sign-on configuration with the following options.
-## Additional resources
+* Click **Test this application** in the Azure portal. This redirects to the OfficeSpace Software Sign-on URL, where you can initiate the login flow.
-- [ List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory ](./tutorial-list.md)
+* Go to the OfficeSpace Software Sign-on URL directly and initiate the login flow from there.
-- [What is application access and single sign-on with Azure Active Directory? ](../manage-apps/what-is-single-sign-on.md)
+* You can use Microsoft My Apps. When you click the OfficeSpace Software tile in My Apps, you're redirected to the OfficeSpace Software Sign-on URL. For more information, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is conditional access in Azure Active Directory?](../conditional-access/overview.md)
+## Next steps
-- [Try OfficeSpace Software with Azure AD](https://aad.portal.azure.com/)
+Once you configure OfficeSpace Software, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Purecloud By Genesys Provisioning Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/purecloud-by-genesys-provisioning-tutorial.md
This tutorial describes the steps you need to perform in both PureCloud by Genes
The scenario outlined in this tutorial assumes that you already have the following prerequisites: * [An Azure AD tenant](../develop/quickstart-create-new-tenant.md)
-* A user account in Azure AD with [permission](../roles/permissions-reference.md) to configure provisioning (e.g. Application Administrator, Cloud Application administrator, Application Owner, or Global Administrator).
+* A user account in Azure AD with [permission](../roles/permissions-reference.md) to configure provisioning (for example, Application Administrator, Cloud Application administrator, Application Owner, or Global Administrator).
* A PureCloud [organization](https://help.mypurecloud.com/?p=81984).
* A user with [permissions](https://help.mypurecloud.com/?p=24360) to create an OAuth client.
The scenario outlined in this tutorial assumes that you already have the followi
## Step 3. Add PureCloud by Genesys from the Azure AD application gallery
-Add PureCloud by Genesys from the Azure AD application gallery to start managing provisioning to PureCloud by Genesys. If you have previously setup PureCloud by Genesys for SSO you can use the same application. However it is recommended that you create a separate app when testing out the integration initially. Learn more about adding an application from the gallery [here](../manage-apps/add-application-portal.md).
+Add PureCloud by Genesys from the Azure AD application gallery to start managing provisioning to PureCloud by Genesys. If you have previously set up PureCloud by Genesys for SSO, you can use the same application. However, it is recommended that you create a separate app when initially testing the integration. Learn more about adding an application from the gallery [here](../manage-apps/add-application-portal.md).
## Step 4. Define who will be in scope for provisioning
This section guides you through the steps to configure the Azure AD provisioning
![Screenshot of the Provisioning Mode dropdown list with the Automatic option called out.](common/provisioning-automatic.png)
-5. Under the **Admin Credentials** section, input your PureCloud by Genesys API URL and Oauth Token in the **Tenant URL** and **Secret Token** fields respectively. The API URL will be be structured as `{{API Url}}/api/v2/scim/v2`, using the API URL for your PureCloud region from the [PureCloud Developer Center](https://developer.mypurecloud.com/api/rest/https://docsupdatetracker.net/index.html). Click **Test Connection** to ensure Azure AD can connect to PureCloud by Genesys. If the connection fails, ensure your PureCloud by Genesys account has Admin permissions and try again.
+5. Under the **Admin Credentials** section, input your PureCloud by Genesys API URL and OAuth token in the **Tenant URL** and **Secret Token** fields, respectively. The API URL will be structured as `{{API Url}}/api/v2/scim/v2`, using the API URL for your PureCloud region from the [PureCloud Developer Center](https://developer.mypurecloud.com/api/rest/index.html). Click **Test Connection** to ensure Azure AD can connect to PureCloud by Genesys. If the connection fails, ensure your PureCloud by Genesys account has Admin permissions and try again.
![Screenshot shows the Admin Credentials dialog box, where you can enter your Tenant U R L and Secret Token.](./media/purecloud-by-genesys-provisioning-tutorial/provisioning.png)
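The Tenant URL described in the step above is simply the region API base URL with the SCIM path appended. As a hedged sketch (the region URL below is a hypothetical example; use the one listed for your region in the PureCloud Developer Center):

```python
def scim_tenant_url(region_api_url: str) -> str:
    """Append the SCIM path to a PureCloud region API base URL."""
    return region_api_url.rstrip("/") + "/api/v2/scim/v2"

# Hypothetical region base URL:
print(scim_tenant_url("https://api.mypurecloud.com"))
# https://api.mypurecloud.com/api/v2/scim/v2
```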
This section guides you through the steps to configure the Azure AD provisioning
9. Review the user attributes that are synchronized from Azure AD to PureCloud by Genesys in the **Attribute-Mapping** section. The attributes selected as **Matching** properties are used to match the user accounts in PureCloud by Genesys for update operations. If you choose to change the [matching target attribute](../app-provisioning/customize-application-attributes.md), you will need to ensure that the PureCloud by Genesys API supports filtering users based on that attribute. Select the **Save** button to commit any changes.
- |Attribute|Type|
- |||
- |userName|String|
+ |Attribute|Type|Supported for filtering|
+ ||||
+ |userName|String|&check;|
|active|Boolean|
|displayName|String|
|emails[type eq "work"].value|String|
|title|String|
|phoneNumbers[type eq "mobile"].value|String|
|phoneNumbers[type eq "work"].value|String|
+ |phoneNumbers[type eq "work2"].value|String|
+ |phoneNumbers[type eq "work3"].value|String|
+ |phoneNumbers[type eq "work4"].value|String|
+ |phoneNumbers[type eq "home"].value|String|
+ |phoneNumbers[type eq "microsoftteams"].value|String|
+ |roles|String|
|urn:ietf:params:scim:schemas:extension:enterprise:2.0:User:department|String|
|urn:ietf:params:scim:schemas:extension:enterprise:2.0:User:manager|Reference|
|urn:ietf:params:scim:schemas:extension:enterprise:2.0:User:employeeNumber|String|
-
+ |urn:ietf:params:scim:schemas:extension:enterprise:2.0:User:division|String|
+ |urn:ietf:params:scim:schemas:extension:genesys:purecloud:2.0:User:externalIds[authority eq 'microsoftteams'].value|String|
+ |urn:ietf:params:scim:schemas:extension:genesys:purecloud:2.0:User:externalIds[authority eq 'ringcentral'].value|String|
+ |urn:ietf:params:scim:schemas:extension:genesys:purecloud:2.0:User:externalIds[authority eq 'zoomphone'].value|String|
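The **Matching** attribute above drives an update-versus-create decision: before updating, the provisioning service queries the SCIM endpoint for a user whose matching attribute equals the Azure AD value, which is why the endpoint must support filtering on it. A minimal sketch of such a lookup URL, assuming a hypothetical tenant URL and `userName` as the matching attribute:

```python
# Sketch: how a provisioning service could look up an existing user by the
# matching attribute (userName) via a SCIM 2.0 filter query.
# The base URL and user name below are hypothetical placeholders.
from urllib.parse import quote

def scim_user_lookup_url(base_url: str, user_name: str) -> str:
    """Build a SCIM 2.0 /Users filter query for a given userName."""
    scim_filter = f'userName eq "{user_name}"'
    return f"{base_url}/Users?filter={quote(scim_filter)}"

url = scim_user_lookup_url(
    "https://example.mypurecloud.com/api/v2/scim/v2",  # hypothetical tenant URL
    "b.simon@contoso.com",
)
print(url)
```

If the query returns a matching resource, the service patches it; if it returns nothing, a new user is created instead.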
10. Under the **Mappings** section, select **Synchronize Azure Active Directory Groups to PureCloud by Genesys**.

11. Review the group attributes that are synchronized from Azure AD to PureCloud by Genesys in the **Attribute-Mapping** section. The attributes selected as **Matching** properties are used to match the groups in PureCloud by Genesys for update operations. Select the **Save** button to commit any changes. PureCloud by Genesys does not support group creation or deletion and only supports updating of groups.
- |Attribute|Type|
- |||
- |displayName|String|
+ |Attribute|Type|Supported for filtering|
+ ||||
+ |displayName|String|&check;|
|externalId|String|
|members|Reference|
Once you've configured provisioning, use the following resources to monitor your
## Change log
-09/10 - Added support for enterprise attribute "employeeNumber".
+* 09/10/2020 - Added support for extension enterprise attribute **employeeNumber**.
+* 05/18/2021 - Added support for core attributes **phoneNumbers[type eq "work2"]**, **phoneNumbers[type eq "work3"]**, **phoneNumbers[type eq "work4"]**, **phoneNumbers[type eq "home"]**, **phoneNumbers[type eq "microsoftteams"]** and **roles**. Also added support for custom extension attributes **urn:ietf:params:scim:schemas:extension:genesys:purecloud:2.0:User:externalIds[authority eq 'microsoftteams']**, **urn:ietf:params:scim:schemas:extension:genesys:purecloud:2.0:User:externalIds[authority eq 'zoomphone']** and **urn:ietf:params:scim:schemas:extension:genesys:purecloud:2.0:User:externalIds[authority eq 'ringcentral']**.
-## Additional resources
+## More resources
* [Managing user account provisioning for Enterprise Apps](../app-provisioning/configure-automatic-user-provisioning-portal.md)
* [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
active-directory Rackspacesso Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/rackspacesso-tutorial.md
Previously updated : 04/15/2019 Last updated : 05/14/2021

# Tutorial: Azure Active Directory integration with Rackspace SSO
-In this tutorial, you learn how to integrate Rackspace SSO with Azure Active Directory (Azure AD).
-Integrating Rackspace SSO with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate Rackspace SSO with Azure Active Directory (Azure AD). When you integrate Rackspace SSO with Azure AD, you can:
-* You can control in Azure AD who has access to Rackspace SSO.
-* You can enable your users to be automatically signed-in to Rackspace SSO (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to Rackspace SSO.
+* Enable your users to be automatically signed-in to Rackspace SSO with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites

To configure Azure AD integration with Rackspace SSO, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get a [free account](https://azure.microsoft.com/free/)
-* Rackspace SSO single sign-on enabled subscription
+* An Azure AD subscription. If you don't have an Azure AD environment, you can get a [free account](https://azure.microsoft.com/free/).
+* Rackspace SSO single sign-on enabled subscription.
## Scenario description

In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* Rackspace SSO supports **SP** initiated SSO
-
-## Adding Rackspace SSO from the gallery
-
-To configure the integration of Rackspace SSO into Azure AD, you need to add Rackspace SSO from the gallery to your list of managed SaaS apps.
-
-**To add Rackspace SSO from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
+* Rackspace SSO supports **SP** initiated SSO.
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
+> [!NOTE]
+> The identifier of this application is a fixed string value, so only one instance can be configured in one tenant.
- ![The Enterprise applications blade](common/enterprise-applications.png)
+## Add Rackspace SSO from the gallery
-3. To add new application, click **New application** button on the top of dialog.
-
- ![The New application button](common/add-new-app.png)
-
-4. In the search box, type **Rackspace SSO**, select **Rackspace SSO** from result panel then click **Add** button to add the application.
+To configure the integration of Rackspace SSO into Azure AD, you need to add Rackspace SSO from the gallery to your list of managed SaaS apps.
- ![Rackspace SSO in the results list](common/search-new-app.png)
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add new application, select **New application**.
+1. In the **Add from the gallery** section, type **Rackspace SSO** in the search box.
+1. Select **Rackspace SSO** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-## Configure and test Azure AD single sign-on
+## Configure and test Azure AD SSO for Rackspace SSO
In this section, you configure and test Azure AD single sign-on with Rackspace SSO based on a test user called **Britta Simon**. When using single sign-on with Rackspace, the Rackspace users will be automatically created the first time they sign in to the Rackspace portal.
-To configure and test Azure AD single sign-on with Rackspace SSO, you need to complete the following building blocks:
+To configure and test Azure AD single sign-on with Rackspace SSO, you need to perform the following steps:
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
2. **[Configure Rackspace SSO Single Sign-On](#configure-rackspace-sso-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-1. **[Set up Attribute Mapping in the Rackspace Control Panel](#set-up-attribute-mapping-in-the-rackspace-control-panel)** - to assign Rackspace roles to Azure AD users.
-1. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
-
-### Configure Azure AD single sign-on
-
-In this section, you enable Azure AD single sign-on in the Azure portal.
-
-To configure Azure AD single sign-on with Rackspace SSO, perform the following steps:
+ 1. **[Set up Attribute Mapping in the Rackspace Control Panel](#set-up-attribute-mapping-in-the-rackspace-control-panel)** - to assign Rackspace roles to Azure AD users.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-1. In the [Azure portal](https://portal.azure.com/), on the **Rackspace SSO** application integration page, select **Single sign-on**.
+## Configure Azure AD SSO
- ![Configure single sign-on link](common/select-sso.png)
+Follow these steps to enable Azure AD SSO in the Azure portal.
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
+1. In the Azure portal, on the **Rackspace SSO** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
- ![Single sign-on select mode](common/select-saml-option.png)
-
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
-
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
4. In the **Basic SAML Configuration** section, upload the **Service Provider metadata file**, which you can download from this [URL](https://login.rackspace.com/federate/sp.xml), and perform the following steps:
To configure Azure AD single sign-on with Rackspace SSO, perform the following s
c. Once the metadata file is successfully uploaded, the necessary URLs are populated automatically.
- d. In the **Sign-on URL** text box, type a URL:
+ d. In the **Sign-on URL** text box, type the URL:
`https://login.rackspace.com/federate/`
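The upload step above works because the SP metadata XML already carries the values the portal needs: the portal reads the entity ID and Assertion Consumer Service URL out of the file and fills the corresponding fields. A minimal sketch of that extraction, using a made-up metadata document (not Rackspace's real metadata):

```python
# Sketch: parse a SAML 2.0 SP metadata document and extract the entity ID and
# Assertion Consumer Service (reply) URL. The XML below is a minimal, made-up
# example for illustration only.
import xml.etree.ElementTree as ET

MD_NS = "urn:oasis:names:tc:SAML:2.0:metadata"

sample_metadata = f"""<EntityDescriptor xmlns="{MD_NS}" entityID="http://sp.example.com">
  <SPSSODescriptor protocolSupportEnumeration="urn:oasis:names:tc:SAML:2.0:protocol">
    <AssertionConsumerService
        Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-POST"
        Location="https://sp.example.com/saml/consume" index="0"/>
  </SPSSODescriptor>
</EntityDescriptor>"""

def parse_sp_metadata(xml_text: str) -> dict:
    """Return the entityID and first ACS Location from SP metadata."""
    root = ET.fromstring(xml_text)
    acs = root.find(f"{{{MD_NS}}}SPSSODescriptor/{{{MD_NS}}}AssertionConsumerService")
    return {"entity_id": root.get("entityID"), "acs_url": acs.get("Location")}

print(parse_sp_metadata(sample_metadata))
```

In the portal these two values land in the **Identifier** and **Reply URL** fields of the Basic SAML Configuration.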
- ![Rackspace SSO Domain and URLs single sign-on information](common/sp-signonurl.png)
5. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, click **Download** to download the **Federation Metadata XML** from the given options, as per your requirement, and save it on your computer.

    ![The Certificate download link](common/metadataxml.png)

    This file will be uploaded to Rackspace to populate the required Identity Federation configuration settings.
-### Configure Rackspace SSO Single Sign-On
-
-To configure single sign-on on **Rackspace SSO** side:
-
-1. See the documentation at [Add an Identity Provider to the Control Panel](https://developer.rackspace.com/docs/rackspace-federation/gettingstarted/add-idp-cp/)
-1. It will lead you through the steps to:
- 1. Create a new Identity Provider
- 1. Specify an email domain that users will use to identify your company when signing in.
- 1. Upload the **Federation Metadata XML** previously downloaded from the Azure control panel.
-
-This will correctly configure the basic SSO settings needed for Azure and Rackspace to connect.
- ### Create an Azure AD test user
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
-
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
+In this section, you'll create a test user in the Azure portal called B.Simon.
- a. In the **Name** field enter **BrittaSimon**.
-
- b. In the **User name** field type `brittasimon@yourcompanydomain.extension`. For example, BrittaSimon@contoso.com
-
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
-
- d. Click **Create**.
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
### Assign the Azure AD test user
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to Rackspace SSO.
-
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **Rackspace SSO**.
-
- ![Enterprise applications blade](common/enterprise-applications.png)
-
-2. In the applications list, select **Rackspace SSO**.
-
- ![The Rackspace SSO link in the Applications list](common/all-applications.png)
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Rackspace SSO.
-3. In the menu on the left, select **Users and groups**.
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Rackspace SSO**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
- ![The "Users and groups" link](common/users-groups-blade.png)
+## Configure Rackspace SSO Single Sign-On
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
+To configure single sign-on on **Rackspace SSO** side:
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
+1. See the documentation at [Add an Identity Provider to the Control Panel](https://developer.rackspace.com/docs/rackspace-federation/gettingstarted/add-idp-cp/)
+1. It will lead you through the steps to:
+ 1. Create a new Identity Provider
+ 1. Specify an email domain that users will use to identify your company when signing in.
+ 1. Upload the **Federation Metadata XML** previously downloaded from the Azure control panel.
-7. In the **Add Assignment** dialog click the **Assign** button.
+This will correctly configure the basic SSO settings needed for Azure and Rackspace to connect.
### Set up Attribute Mapping in the Rackspace control panel
mapping:
See the Rackspace [Attribute Mapping Basics documentation](https://developer.rackspace.com/docs/rackspace-federation/appendix/map/) for more examples.
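Conceptually, an attribute mapping translates the claims asserted by Azure AD into the fields the service expects. The sketch below illustrates that idea only; the dictionary syntax, claim URIs, and target field names are illustrative assumptions, not Rackspace's actual YAML mapping policy:

```python
# Illustrative sketch of attribute mapping: translate incoming SAML assertion
# attributes (keyed by claim URI) into the fields a service expects.
# The claim URIs and target field names here are illustrative assumptions.
def apply_mapping(assertion_attrs: dict, mapping: dict) -> dict:
    """Return {target_field: claim_value} for each configured mapping entry."""
    return {target: assertion_attrs.get(source) for target, source in mapping.items()}

mapping = {
    "name": "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name",
    "email": "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress",
}
attrs = {
    "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name": "B.Simon",
    "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress": "b.simon@contoso.com",
}
print(apply_mapping(attrs, mapping))
```

The real mapping language additionally supports defaults and role assignment; see the linked Rackspace documentation for its actual syntax.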
-### Test single sign-on
+## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+In this section, you test your Azure AD single sign-on configuration with the following options.
-When you click the Rackspace SSO tile in the Access Panel, you should be automatically signed in to the Rackspace SSO for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+* Click on **Test this application** in the Azure portal. This will redirect to the Rackspace SSO Sign-on URL where you can initiate the login flow.
-You can also use the **Validate** button in the **Rackspace SSO** Single sign-on settings:
+* Go to the Rackspace SSO Sign-on URL directly and initiate the login flow from there.
- ![SSO Validate Button](common/sso-validate-sign-on.png)
+* You can use Microsoft My Apps. When you click the Rackspace SSO tile in My Apps, this will redirect to the Rackspace SSO Sign-on URL. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-## Additional Resources
+You can also use the **Validate** button in the **Rackspace SSO** Single sign-on settings:
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+ ![SSO Validate Button](common/sso-validate-sign-on.png)
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure Rackspace SSO you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Sumologic Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/sumologic-tutorial.md
Previously updated : 01/03/2020 Last updated : 05/14/2021
In this tutorial, you'll learn how to integrate SumoLogic with Azure Active Dire
* Enable your users to be automatically signed-in to SumoLogic with their Azure AD accounts.
* Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-
## Prerequisites

To get started, you need the following items:
To get started, you need the following items:
In this tutorial, you configure and test Azure AD SSO in a test environment.
-* SumoLogic supports **IDP** initiated SSO
+* SumoLogic supports **IDP** initiated SSO.
-## Adding SumoLogic from the gallery
+## Add SumoLogic from the gallery
To configure the integration of SumoLogic into Azure AD, you need to add SumoLogic from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service.
1. Navigate to **Enterprise Applications** and then select **All Applications**.
1. To add new application, select **New application**.
1. In the **Add from the gallery** section, type **SumoLogic** in the search box.
1. Select **SumoLogic** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-## Configure and test Azure AD single sign-on for SumoLogic
+## Configure and test Azure AD SSO for SumoLogic
Configure and test Azure AD SSO with SumoLogic using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in SumoLogic.
-To configure and test Azure AD SSO with SumoLogic, complete the following building blocks:
+To configure and test Azure AD SSO with SumoLogic, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
- * **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
- * **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
1. **[Configure SumoLogic SSO](#configure-sumologic-sso)** - to configure the single sign-on settings on application side.
- * **[Create SumoLogic test user](#create-sumologic-test-user)** - to have a counterpart of B.Simon in SumoLogic that is linked to the Azure AD representation of user.
+ 1. **[Create SumoLogic test user](#create-sumologic-test-user)** - to have a counterpart of B.Simon in SumoLogic that is linked to the Azure AD representation of user.
1. **[Test SSO](#test-sso)** - to verify whether the configuration works.

## Configure Azure AD SSO

Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **SumoLogic** application integration page, find the **Manage** section and select **single sign-on**.
+1. In the Azure portal, on the **SumoLogic** application integration page, find the **Manage** section and select **single sign-on**.
1. On the **Select a single sign-on method** page, select **SAML**.
-1. On the **Set up single sign-on with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
-1. On the **Set up single sign-on with SAML** page, enter the values for the following fields:
-
- a. In the **Identifier** text box, type a URL using the following pattern:
-
- - `https://service.sumologic.com`
- - `https://<tenantname>.us2.sumologic.com`
- - `https://<tenantname>.us4.sumologic.com`
- - `https://<tenantname>.eu.sumologic.com`
- - `https://<tenantname>.jp.sumologic.com`
- - `https://<tenantname>.de.sumologic.com`
- - `https://<tenantname>.ca.sumologic.com`
-
- b. In the **Reply URL** text box, type a URL using the following pattern:
-
- - `https://service.sumologic.com/sumo/saml/consume/<tenantname>`
- - `https://service.us2.sumologic.com/sumo/saml/consume/<tenantname>`
- - `https://service.us4.sumologic.com/sumo/saml/consume/<tenantname>`
- - `https://service.eu.sumologic.com/sumo/saml/consume/<tenantname>`
- - `https://service.jp.sumologic.com/sumo/saml/consume/<tenantname>`
- - `https://service.de.sumologic.com/sumo/saml/consume/<tenantname>`
- - `https://service.ca.sumologic.com/sumo/saml/consume/<tenantname>`
- - `https://service.au.sumologic.com/sumo/saml/consume/<tenantname>`
+1. On the **Set up single sign-on with SAML** page, perform the following steps:
+
+ a. In the **Identifier** text box, type a URL using one of the following patterns:
+
+ | Identifier URL |
+ ||
+ | `https://service.sumologic.com`|
+ | `https://<tenantname>.us2.sumologic.com`|
+ | `https://<tenantname>.us4.sumologic.com`|
+ | `https://<tenantname>.eu.sumologic.com`|
+ | `https://<tenantname>.jp.sumologic.com`|
+ | `https://<tenantname>.de.sumologic.com`|
+ | `https://<tenantname>.ca.sumologic.com`|
+
+ b. In the **Reply URL** text box, type a URL using one of the following patterns:
+
+ | Reply URL |
+ ||
+ | `https://service.sumologic.com/sumo/saml/consume/<tenantname>` |
+ | `https://service.us2.sumologic.com/sumo/saml/consume/<tenantname>` |
+ | `https://service.us4.sumologic.com/sumo/saml/consume/<tenantname>` |
+ | `https://service.eu.sumologic.com/sumo/saml/consume/<tenantname>` |
+ | `https://service.jp.sumologic.com/sumo/saml/consume/<tenantname>` |
+ | `https://service.de.sumologic.com/sumo/saml/consume/<tenantname>` |
+ | `https://service.ca.sumologic.com/sumo/saml/consume/<tenantname>` |
+ | `https://service.au.sumologic.com/sumo/saml/consume/<tenantname>` |
> [!NOTE]
> These values are not real. Update these values with the actual Identifier and Reply URL. Contact [SumoLogic Client support team](https://www.sumologic.com/contact-us/) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
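The Identifier and Reply URL patterns above vary only by tenant name and deployment region. As a sketch of that rule (assuming the patterns in the tables, with `us1` standing in for the bare `service.sumologic.com` deployment):

```python
# Sketch, assuming the URL patterns above: derive the SumoLogic Identifier and
# Reply URL from a tenant name and deployment region. "us1" here is a stand-in
# label for the bare service.sumologic.com deployment, not an official region code.
def sumologic_urls(tenant: str, region: str) -> tuple[str, str]:
    host = "service.sumologic.com" if region == "us1" else f"service.{region}.sumologic.com"
    identifier = ("https://service.sumologic.com" if region == "us1"
                  else f"https://{tenant}.{region}.sumologic.com")
    reply = f"https://{host}/sumo/saml/consume/{tenant}"
    return identifier, reply

print(sumologic_urls("contoso", "eu"))
```

As the note says, confirm the actual values with SumoLogic support before relying on a derived URL.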
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
1. In the applications list, select **SumoLogic**.
1. In the app's overview page, find the **Manage** section and select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add User link](common/add-assign-user.png)
- 1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
-1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
1. In the **Add Assignment** dialog, click the **Assign** button.

## Configure SumoLogic SSO

1. In a different web browser window, sign in to your SumoLogic company site as an administrator.
-1. Go to **Manage \> Security**.
+1. Go to **Manage** -> **Security**.
- ![Manage](./media/sumologic-tutorial/ic778556.png "Manage")
+ ![Manage](./media/sumologic-tutorial/security.png "Manage")
1. Click **SAML**.
- ![Global security settings](./media/sumologic-tutorial/ic778557.png "Global security settings")
+ ![Global security settings](./media/sumologic-tutorial/settings.png "Global security settings")
1. From the **Select a configuration or create a new one** list, select **Azure AD**, and then click **Configure**.
- ![Screenshot shows Configure SAML 2.0 where you can select Azure A D.](./media/sumologic-tutorial/ic778558.png "Configure SAML 2.0")
+ ![Screenshot shows Configure SAML 2.0 where you can select Azure A D.](./media/sumologic-tutorial/configure.png "Configure SAML 2.0")
1. On the **Configure SAML 2.0** dialog, perform the following steps:
- ![Screenshot shows the Configure SAML 2.0 dialog box where you can enter the values described.](./media/sumologic-tutorial/ic778559.png "Configure SAML 2.0")
+ ![Screenshot shows the Configure SAML 2.0 dialog box where you can enter the values described.](./media/sumologic-tutorial/configuration.png "Configure SAML 2.0")
a. In the **Configuration Name** textbox, type **Azure AD**.
In order to enable Azure AD users to sign in to SumoLogic, they must be provisio
1. Go to **Manage \> Users**.
- ![Screenshot shows Users selected from the Manage menu.](./media/sumologic-tutorial/ic778561.png "Users")
+ ![Screenshot shows Users selected from the Manage menu.](./media/sumologic-tutorial/user.png "Users")
1. Click **Add**.
- ![Screenshot shows the Add button for Users.](./media/sumologic-tutorial/ic778562.png "Users")
+ ![Screenshot shows the Add button for Users.](./media/sumologic-tutorial/add-user.png "Users")
1. On the **New User** dialog, perform the following steps:
- ![New User](./media/sumologic-tutorial/ic778563.png "New User")
+ ![New User](./media/sumologic-tutorial/new-account.png "New User")
a. Type the related information of the Azure AD account you want to provision into the **First Name**, **Last Name**, and **Email** textboxes.
In order to enable Azure AD users to sign in to SumoLogic, they must be provisio
## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
-
-When you click the SumoLogic tile in the Access Panel, you should be automatically signed in to the SumoLogic for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
-
-## Additional resources
+In this section, you test your Azure AD single sign-on configuration with the following options.
-- [ List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory ](./tutorial-list.md)
+* Click on **Test this application** in the Azure portal and you should be automatically signed in to the SumoLogic for which you set up the SSO.
-- [What is application access and single sign-on with Azure Active Directory? ](../manage-apps/what-is-single-sign-on.md)
+* You can use Microsoft My Apps. When you click the SumoLogic tile in the My Apps, you should be automatically signed in to the SumoLogic for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is conditional access in Azure Active Directory?](../conditional-access/overview.md)
+## Next steps
-- [Try SumoLogic with Azure AD](https://aad.portal.azure.com/)
+Once you configure SumoLogic you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Talentsoft Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/talentsoft-tutorial.md
Previously updated : 05/12/2020 Last updated : 05/17/2021
In this tutorial, you'll learn how to integrate Talentsoft with Azure Active Dir
* Enable your users to be automatically signed-in to Talentsoft with their Azure AD accounts.
* Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-
## Prerequisites

To get started, you need the following items:
To get started, you need the following items:
In this tutorial, you configure and test Azure AD SSO in a test environment.
-* Talentsoft supports **SP and IDP** initiated SSO
+* Talentsoft supports **SP and IDP** initiated SSO.
-## Adding Talentsoft from the gallery
+## Add Talentsoft from the gallery
To configure the integration of Talentsoft into Azure AD, you need to add Talentsoft from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service.
1. Navigate to **Enterprise Applications** and then select **All Applications**.
1. To add new application, select **New application**.
1. In the **Add from the gallery** section, type **Talentsoft** in the search box.
1. Select **Talentsoft** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-## Configure and test Azure AD single sign-on for Talentsoft
+## Configure and test Azure AD SSO for Talentsoft
Configure and test Azure AD SSO with Talentsoft using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Talentsoft.
-To configure and test Azure AD SSO with Talentsoft, complete the following building blocks:
+To configure and test Azure AD SSO with Talentsoft, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
- * **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
- * **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
1. **[Configure Talentsoft SSO](#configure-talentsoft-sso)** - to configure the single sign-on settings on application side.
- * **[Create Talentsoft test user](#create-talentsoft-test-user)** - to have a counterpart of B.Simon in Talentsoft that is linked to the Azure AD representation of user.
+ 1. **[Create Talentsoft test user](#create-talentsoft-test-user)** - to have a counterpart of B.Simon in Talentsoft that is linked to the Azure AD representation of user.
1. **[Test SSO](#test-sso)** - to verify whether the configuration works. ## Configure Azure AD SSO Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **Talentsoft** application integration page, find the **Manage** section and select **single sign-on**.
+1. In the Azure portal, on the **Talentsoft** application integration page, find the **Manage** section and select **single sign-on**.
1. On the **Select a single sign-on method** page, select **SAML**.
-1. On the **Set up single sign-on with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
-1. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, enter the values for the following fields:
+1. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, perform the following steps:
a. In the **Identifier** text box, type a URL using the following pattern: `https://<fedserver>/<tenant>/trust`
Follow these steps to enable Azure AD SSO in the Azure portal.
> [!NOTE] > These values are not real. Update these values with the actual Identifier, Reply URL and Sign-on URL. Contact [Talentsoft Client support team](mailto:advancedservices@talentsoft.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
-1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Certificate (Base64)** and select **Download** to download the certificate and save it on your computer.
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Certificate (Base64)** and select **Download** to download the certificate and save it on your computer.
![The Certificate download link](common/certificatebase64.png)
-1. On the **Set up Talentsoft** section, copy the appropriate URL(s) based on your requirement.
+1. On the **Set up Talentsoft** section, copy the appropriate URL(s) based on your requirement.
![Copy configuration URLs](common/copy-configuration-urls.png)
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**. 1. In the applications list, select **Talentsoft**. 1. In the app's overview page, find the **Manage** section and select **Users and groups**.-
- ![The "Users and groups" link](common/users-groups-blade.png)
- 1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.-
- ![The Add User link](common/add-assign-user.png)
- 1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
-1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
1. In the **Add Assignment** dialog, click the **Assign** button. ## Configure Talentsoft SSO
In this section, you create a user called B.Simon in Talentsoft. Work with [Tale
## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+#### SP initiated:
+
+* Click on **Test this application** in the Azure portal. This will redirect to the Talentsoft Sign-on URL where you can initiate the login flow.
-When you click the Talentsoft tile in the Access Panel, you should be automatically signed in to the Talentsoft for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+* Go to the Talentsoft Sign-on URL directly and initiate the login flow from there.
-## Additional resources
+#### IDP initiated:
-- [ List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory ](./tutorial-list.md)
+* Click on **Test this application** in the Azure portal and you should be automatically signed in to Talentsoft, for which you set up SSO.
-- [What is application access and single sign-on with Azure Active Directory? ](../manage-apps/what-is-single-sign-on.md)
+You can also use Microsoft My Apps to test the application in any mode. When you click the Talentsoft tile in My Apps, if the app is configured in SP mode you are redirected to the application sign-on page to initiate the login flow, and if configured in IDP mode, you should be automatically signed in to Talentsoft, for which you set up SSO. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is conditional access in Azure Active Directory?](../conditional-access/overview.md)
+## Next steps
-- [Try Talentsoft with Azure AD](https://aad.portal.azure.com/)
+Once you configure Talentsoft, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Trend Micro Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/trend-micro-tutorial.md
Previously updated : 04/21/2020 Last updated : 05/14/2021
In this tutorial, you'll learn how to integrate Trend Micro Web Security (TMWS)
* Enable your users to be automatically signed in to TMWS with their Azure AD accounts. * Manage your accounts in one central location: the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [Single sign-on to applications in Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
- ## Prerequisites To get started, you need:
To get started, you need:
In this tutorial, you'll configure and test Azure AD SSO in a test environment.
-* TMWS supports SP-initiated SSO.
-* After you configure TMWS, you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. To learn how to enforce session control by using Microsoft Cloud App Security, see [Onboard and deploy Conditional Access App Control for any app](/cloud-app-security/proxy-deployment-any-app).
+* TMWS supports **SP** initiated SSO.
## Add TMWS from the gallery To configure the integration of TMWS into Azure AD, you need to add TMWS from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) with either a work or school account or a personal Microsoft account.
+1. Sign in to the Azure portal with either a work or school account or a personal Microsoft account.
1. In the left pane, select the **Azure Active Directory** service. 1. Select **Enterprise applications** and then select **All applications**. 1. To add a new application, select **New application**.
You'll complete these basic steps to configure and test Azure AD SSO with TMWS:
Complete these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **Trend Micro Web Security (TMWS)** application integration page, in the **Manage** section, select **single sign-on**.
+1. In the Azure portal, on the **Trend Micro Web Security (TMWS)** application integration page, in the **Manage** section, select **single sign-on**.
1. On the **Select a single sign-on method** page, select **SAML**. 1. On the **Set up Single Sign-On with SAML** page, select the pen button for **Basic SAML Configuration** to edit the settings:
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise applications**, and then select **All applications**. 1. In the applications list, select **Trend Micro Web Security (TMWS)**. 1. In the app's overview page, in the **Manage** section, select **Users and groups**:-
- ![Select Users and groups](common/users-groups-blade.png)
- 1. Select **Add user**, and then select **Users and groups** in the **Add Assignment** dialog box.-
- ![Select Add user](common/add-assign-user.png)
- 1. In the **Users and groups** dialog box, select **B.Simon** in the **Users** list, and then click the **Select** button at the bottom of the screen. 1. If you expect a role value in the SAML assertion, in the **Select Role** dialog box, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen. 1. In the **Add Assignment** dialog box, select **Assign**.
For details, see [Traffic Forwarding Using PAC Files](https://docs.trendmicro.co
1. In the Azure AD sign-in window, enter your Azure AD account credentials. You should now be signed in to TMWS.
-## Additional resources
--- [Tutorials on how to integrate SaaS apps with Azure Active Directory](./tutorial-list.md)--- [What is application access and single sign-on with Azure Active Directory? ](../manage-apps/what-is-single-sign-on.md)--- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)--- [Try Trend Micro Web Security with Azure AD](https://aad.portal.azure.com/)--- [What is session control in Microsoft Cloud App Security?](/cloud-app-security/proxy-intro-aad)
+## Next steps
-- [How to protect Trend Micro Web Security with advanced visibility and controls](/cloud-app-security/proxy-intro-aad)
+Once you configure TMWS, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Enable Your Tenant Verifiable Credentials https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/verifiable-credentials/enable-your-tenant-verifiable-credentials.md
Previously updated : 04/01/2021 Last updated : 05/18/2021
aks Availability Zones https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/aks/availability-zones.md
Volumes that use Azure managed disks are currently not zone-redundant resources.
Kubernetes is aware of Azure availability zones since version 1.12. You can deploy a PersistentVolumeClaim object referencing an Azure Managed Disk in a multi-zone AKS cluster and [Kubernetes will take care of scheduling](https://kubernetes.io/docs/setup/best-practices/multiple-zones/#storage-access-for-zones) any pod that claims this PVC in the correct availability zone.
+### Azure Resource Manager templates and availability zones
+
+When *creating* an AKS cluster, if you explicitly define a [null value in a template][arm-template-null] with syntax such as `"availabilityZones": null`, the Resource Manager template treats the property as if it doesn't exist, which means your cluster won't have availability zones enabled. Also, if you create a cluster with a Resource Manager template that omits the availability zones property, availability zones are disabled.
+
+You can't update settings for availability zones on an existing cluster, so the behavior is different when updating an AKS cluster with Resource Manager templates. If you explicitly set a null value in your template for availability zones and *update* your cluster, no changes are made to your cluster for availability zones. However, if you specify an empty value for the availability zones property, with syntax such as `"availabilityZones": []`, the deployment attempts to disable availability zones on your existing AKS cluster and **fails**.
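+The null-versus-empty-array behavior described above can be illustrated with a trimmed, hypothetical agent pool fragment from a `Microsoft.ContainerService/managedClusters` template (the resource names, `apiVersion`, and VM size shown are placeholders, not values from this article):
+
+```json
+{
+  "type": "Microsoft.ContainerService/managedClusters",
+  "apiVersion": "2021-03-01",
+  "name": "myAKSCluster",
+  "properties": {
+    "agentPoolProfiles": [
+      {
+        "name": "nodepool1",
+        "count": 3,
+        "vmSize": "Standard_DS2_v2",
+        "mode": "System",
+        "availabilityZones": [ "1", "2", "3" ]
+      }
+    ]
+  }
+}
+```
+
+At create time, replacing the array with `"availabilityZones": null` (or removing the property) deploys the cluster without availability zones. On an update, `null` leaves the existing zone configuration unchanged, while `"availabilityZones": []` attempts to disable zones and fails.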
+ ## Overview of availability zones for AKS clusters Availability zones are a high-availability offering that protects your applications and data from datacenter failures. Zones are unique physical locations within an Azure region. Each zone is made up of one or more datacenters equipped with independent power, cooling, and networking. To ensure resiliency, there's always more than one zone in all zone enabled regions. The physical separation of availability zones within a region protects applications and data from datacenter failures.
This article detailed how to create an AKS cluster that uses availability zones.
[az-aks-nodepool-add]: /cli/azure/aks/nodepool#az_aks_nodepool_add [az-aks-get-credentials]: /cli/azure/aks#az_aks_get_credentials [vmss-zone-balancing]: ../virtual-machine-scale-sets/virtual-machine-scale-sets-use-availability-zones.md#zone-balancing
+[arm-template-null]: ../azure-resource-manager/templates/template-expressions.md#null-values
<!-- LINKS - external --> [kubectl-describe]: https://kubernetes.io/docs/reference/generated/kubectl/kubectl-commands#describe
aks Enable Host Encryption https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/aks/enable-host-encryption.md
Title: Enable host-based encryption on Azure Kubernetes Service (AKS)
description: Learn how to configure host-based encryption in an Azure Kubernetes Service (AKS) cluster Previously updated : 03/03/2021 - Last updated : 04/26/2021 ++
This feature can only be set at cluster creation or node pool creation time.
### Prerequisites -- The Azure CLI version 2.23.0 or later+
+- Ensure you have Azure CLI version 2.23.0 or later installed.
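+As a hedged sketch of enabling this feature at creation time (the resource group, cluster, and node pool names below are placeholders), host-based encryption is turned on with the `--enable-encryption-at-host` flag:
+
+```azurecli
+# Create a new cluster with host-based encryption on its initial node pool.
+az aks create \
+    --resource-group myResourceGroup \
+    --name myAKSCluster \
+    --node-vm-size Standard_DS2_v2 \
+    --enable-encryption-at-host
+
+# Add an additional node pool with host-based encryption to an existing cluster.
+az aks nodepool add \
+    --resource-group myResourceGroup \
+    --cluster-name myAKSCluster \
+    --name mynodepool \
+    --node-vm-size Standard_DS2_v2 \
+    --enable-encryption-at-host
+```
+
+Because the setting can only be applied at cluster or node pool creation, enabling it for existing workloads means adding a new encrypted node pool and migrating workloads to it.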
+ ### Limitations
api-management Api Management Howto Use Managed Service Identity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/api-management/api-management-howto-use-managed-service-identity.md
In this template, you will deploy:
To run the deployment automatically, click the following button:
-[![Deploy to Azure](../media/template-deployments/deploy-to-azure.svg)](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FAzure%2Fazure-quickstart-templates%2Fmaster%2F101-api-management-key-vault-create%2Fazuredeploy.json)
+[![Deploy to Azure](../media/template-deployments/deploy-to-azure.svg)](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FAzure%2Fazure-quickstart-templates%2Fmaster%2Fquickstarts%2Fmicrosoft.apimanagement%2Fapi-management-key-vault-create%2Fazuredeploy.json)
### Authenticate to the back end by using a user-assigned identity
app-service Configure Custom Container https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/configure-custom-container.md
In your *docker-compose.yml* file, map the `volumes` option to `${WEBAPP_STORAGE
```yaml wordpress:
- image: wordpress:latest
+ image: <image name:tag>
volumes: - ${WEBAPP_STORAGE_HOME}/site/wwwroot:/var/www/html - ${WEBAPP_STORAGE_HOME}/phpmyadmin:/var/www/phpmyadmin
app-service Create Ilb Ase https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/environment/create-ilb-ase.md
The SCM site name takes you to the Kudu console, called the **Advanced portal**,
Internet-based CI systems, such as GitHub and Azure DevOps, will still work with an ILB ASE if the build agent is internet accessible and on the same network as ILB ASE. So in case of Azure DevOps, if the build agent is created on the same VNET as ILB ASE (different subnet is fine), it will be able to pull code from Azure DevOps git and deploy to ILB ASE. If you don't want to create your own build agent, you need to use a CI system that uses a pull model, such as Dropbox.
-The publishing endpoints for apps in an ILB ASE use the domain that the ILB ASE was created with. This domain appears in the app's publishing profile and in the app's portal blade (**Overview** > **Essentials** and also **Properties**). If you have an ILB ASE with the domain suffix *&lt;ASE name&gt;.appserviceenvironment.net*, and an app named *mytest*, use *mytest.&lt;ASE name&gt;.appserviceenvironment.net* for FTP and *mytest.scm.contoso.net* for web deployment.
+The publishing endpoints for apps in an ILB ASE use the domain that the ILB ASE was created with. This domain appears in the app's publishing profile and in the app's portal blade (**Overview** > **Essentials** and also **Properties**). If you have an ILB ASE with the domain suffix *&lt;ASE name&gt;.appserviceenvironment.net*, and an app named *mytest*, use *mytest.&lt;ASE name&gt;.appserviceenvironment.net* for FTP and *mytest.scm.contoso.net* for MSDeploy deployment.
## Configure an ILB ASE with a WAF device ##
app-service Scenario Secure App Authentication App Service https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/scenario-secure-app-authentication-app-service.md
At the bottom of the **Add an identity provider** page, click **Add** to enable
You now have an app that's secured by the App Service authentication and authorization.
+> [!NOTE]
+> To allow accounts from other tenants, change the **Issuer URL** to `https://login.microsoftonline.com/common/v2.0` by editing your identity provider from the **Authentication** blade.
+>
+ ## Verify limited access to the web app When you enabled the App Service authentication/authorization module, an app registration was created in your Azure AD tenant. The app registration has the same display name as your web app. To check the settings, select **Azure Active Directory** from the portal menu, and select **App registrations**. Select the app registration that was created. In the overview, verify that **Supported account types** is set to **My organization only**.
app-service Template Deploy Private Endpoint https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/scripts/template-deploy-private-endpoint.md
This template creates a private endpoint for an Azure web app.
Here's how to deploy the Azure Resource Manager template to Azure:
-1. To sign in to Azure and open the template, select this link: [Deploy to Azure](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FAzure%2Fazure-quickstart-templates%2Fmaster%2F101-private-endpoint-webapp%2Fazuredeploy.json). The template creates the virtual network, the web app, the private endpoint, and the private DNS zone.
+1. To sign in to Azure and open the template, select this link: [Deploy to Azure](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FAzure%2Fazure-quickstart-templates%2Fmaster%2Fquickstarts%2Fmicrosoft.web%2Fprivate-endpoint-webapp%2Fazuredeploy.json). The template creates the virtual network, the web app, the private endpoint, and the private DNS zone.
2. Select or create your resource group. 3. Enter the name of your web app, Azure App Service plan, and private endpoint. 5. Read the statement about terms and conditions. If you agree, select **I agree to the terms and conditions stated above** > **Purchase**. The deployment can take several minutes to finish.
automation Automation Solution Vm Management Enable https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/automation/automation-solution-vm-management-enable.md
Title: Enable Azure Automation Start/Stop VMs during off-hours
description: This article tells how to enable the Start/Stop VMs during off-hours feature for your Azure VMs. Previously updated : 04/01/2020 Last updated : 05/18/2021
Perform the steps in this topic in sequence to enable the Start/Stop VMs during
>To use this feature with classic VMs, you need a Classic Run As account, which is not created by default. See [Create a Classic Run As account](automation-create-standalone-account.md#create-a-classic-run-as-account). >
-## Create resources for the feature
+## Enable and configure
1. Sign in to the Azure [portal](https://portal.azure.com). 2. Search for and select **Automation Accounts**.
-3. On the Automation Accounts page, select your Automation account from the list.
-4. From the Automation account, select **Start/Stop VM** under **Related Resources**. From here, you can click **Learn more about and enable the solution**. If you already have the feature deployed, you can click **Manage the solution** and finding it in the list.
+3. On the **Automation Accounts** page, select your Automation account from the list.
+4. From the Automation account, select **Start/Stop VM** under **Related Resources**. From here, you can click **Learn more about and enable the solution**. If you already have the feature deployed, you can click **Manage the solution** and find it in the list.
![Enable from automation account](./media/automation-solution-vm-management/enable-from-automation-account.png)
Perform the steps in this topic in sequence to enable the Start/Stop VMs during
![Azure portal](media/automation-solution-vm-management/azure-portal-01.png)
-## Configure the feature
-
-With the resource created, the Add Solution page appears. You're prompted to configure the feature before you can import it into your Automation subscription. See [Configure Start/Stop VMs during off-hours](automation-solution-vm-management-config.md).
+ With the resource created, the Add Solution page appears. You're prompted to configure the feature before you can import it into your Automation account.
![VM management Add Solution page](media/automation-solution-vm-management/azure-portal-add-solution-01.png)
-## Select a Log Analytics workspace
-
-1. On the Add Solution page, select **Workspace**. Select a Log Analytics workspace that's linked to the Azure subscription used by the Automation account.
-
-2. If you don't have a workspace, select **Create New Workspace**. On the Log Analytics workspace page, perform the following steps:
-
- - Specify a name for the new Log Analytics workspace, such as **ContosoLAWorkspace**.
- - Select a **Subscription** to link to by selecting from the dropdown list, if the default selected is not appropriate.
- - For **Resource Group**, you can create a new resource group or select an existing one.
- - Select a **Location**.
- - Select a **Pricing tier**. Choose the **Per GB (Standalone)** option. Azure Monitor logs have updated [pricing](https://azure.microsoft.com/pricing/details/log-analytics/) and the Per GB tier is the only option.
+6. On the **Add Solution** page, select **Workspace**. Select an existing Log Analytics workspace from the list. If there isn't an Automation account in the same supported region as the workspace, you can create a new Automation account in the next step.
> [!NOTE] > When enabling features, only certain regions are supported for linking a Log Analytics workspace and an Automation account. For a list of the supported mapping pairs, see [Region mapping for Automation account and Log Analytics workspace](how-to/region-mappings.md).
-3. After providing the required information on the Log Analytics workspace page, click **Create**. You can track its progress under **Notifications** from the menu, which returns you to the Add Solution page when done.
-
-## Add Automation account
-
-Access the Add Solution page again and select **Automation account**. You can select an existing Automation account that not already linked to a Log Analytics workspace. If you're creating a new Log Analytics workspace, you can create a new Automation account to associate with it. Select an existing Automation account or click **Create an Automation account**, and on the Add Automation account page, provide the the name of the Automation account in the **Name** field.
-
-All other options are automatically populated, based on the Log Analytics workspace selected. You can't modify these options. An Azure Run As account is the default authentication method for the runbooks included with the feature.
-
-After you click **OK**, the configuration options are validated and the Automation account is created. You can track its progress under **Notifications** from the menu.
+7. On the **Add Solution** page, if there isn't an Automation account available in the same supported region as the workspace, select **Automation account**. You can create a new Automation account to associate with it by selecting **Create an Automation account**, and on the **Add Automation account** page, provide the name of the Automation account in the **Name** field.
-## Define feature parameters
+ All other options are automatically populated, based on the Log Analytics workspace selected. You can't modify these options. An Azure Run As account is the default authentication method for the runbooks included with the feature.
+
+ After you click **OK**, the configuration options are validated and the Automation account is created. You can track its progress under **Notifications** from the menu.
-1. On the Add Solution page, select **Configuration**. The Parameters page appears.
+8. On the **Add Solution** page, select **Configure parameters**. The **Parameters** page appears.
![Parameters page for solution](media/automation-solution-vm-management/azure-portal-add-solution-02.png)
-2. Specify a value for the **Target ResourceGroup Names** field. The field defines group names that contain VMs for the feature to manage. You can enter more than one name and separate the names using commas (values are not case-sensitive). Using a wildcard is supported if you want to target VMs in all resource groups in the subscription. The values are stored in the `External_Start_ResourceGroupNames` and `External_Stop_ResourceGroupNames` variables.
+9. Specify a value for the **Target ResourceGroup Names** field. The field defines group names that contain VMs for the feature to manage. You can enter more than one name and separate the names using commas (values are not case-sensitive). Using a wildcard is supported if you want to target VMs in all resource groups in the subscription. The values are stored in the `External_Start_ResourceGroupNames` and `External_Stop_ResourceGroupNames` variables.
> [!IMPORTANT] > The default value for **Target ResourceGroup Names** is a **&ast;**. This setting targets all VMs in a subscription. If you don't want the feature to target all the VMs in your subscription, you must provide a list of resource group names before selecting a schedule.
-3. Specify a value for the **VM Exclude List (string)** field. This value is the name of one or more virtual machines from the target resource group. You can enter more than one name and separate the names using commas (values are not case-sensitive). Using a wildcard is supported. This value is stored in the `External_ExcludeVMNames` variable.
+10. Specify a value for the **VM Exclude List (string)** field. This value is the name of one or more virtual machines from the target resource group. You can enter more than one name and separate the names using commas (values are not case-sensitive). Using a wildcard is supported. This value is stored in the `External_ExcludeVMNames` variable.
-4. Use the **Schedule** field to select a schedule for VM management by the feature. Select a start date and time for your schedule, to create a recurring daily schedule starting at the chosen time. Selecting a different region is not available. To configure the schedule to your specific time zone after configuring the feature, see [Modify the startup and shutdown schedules](automation-solution-vm-management-config.md#modify-the-startup-and-shutdown-schedules).
+11. Use the **Schedule** field to select a schedule for VM management by the feature. Select a start date and time for your schedule to create a recurring daily schedule starting at the chosen time. Selecting a different region is not available. To configure the schedule to your specific time zone after configuring the feature, see [Modify the startup and shutdown schedules](automation-solution-vm-management-config.md#modify-the-startup-and-shutdown-schedules).
-5. To receive email notifications from an [action group](../azure-monitor/alerts/action-groups.md), accept the default value of **Yes** in the **Email notifications** field, and provide a valid email address. If you select **No** but decide at a later date that you want to receive email notifications, you can update the action group that is created with valid email addresses separated by commas.
-
-6. Enable the following alert rules:
+12. To receive email notifications from an [action group](../azure-monitor/alerts/action-groups.md), accept the default value of **Yes** in the **Email notifications** field, and provide a valid email address. If you select **No** but decide at a later date that you want to receive email notifications, you can update the action group that is created with valid email addresses separated by commas. The following alert rules are created in the subscription:
- `AutoStop_VM_Child` - `Scheduled_StartStop_Parent` - `Sequenced_StartStop_Parent`
-## Create alerts
-
-Start/Stop VMs during off-hours doesn't include a predefined set of alerts. Review [Create log alerts with Azure Monitor](../azure-monitor/alerts/alerts-log.md) to learn how to create job failed alerts to support your DevOps or operational processes and procedures.
-
-## Deploy the feature
+13. After you have configured the initial settings required for the feature, click **OK** to close the **Parameters** page.
-1. After you have configured the initial settings required for the feature, click **OK** to close the Parameters page.
-
-2. Click **Create**. After all settings are validated, the feature deploys to your subscription. This process can take several seconds to finish, and you can track its progress under **Notifications** from the menu.
+14. Click **Create**. After all settings are validated, the feature deploys to your subscription. This process can take several seconds to finish, and you can track its progress under **Notifications** from the menu.
> [!NOTE] > If you have an Azure Cloud Solution Provider (Azure CSP) subscription, after deployment is complete, in your Automation account, go to **Variables** under **Shared Resources** and set the [External_EnableClassicVMs](automation-solution-vm-management.md#variables) variable to **False**. This stops the solution from looking for Classic VM resources.
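+The shared variables mentioned above can be updated after deployment without rerunning the configuration. As one hedged sketch using Az PowerShell (the resource group and Automation account names are placeholders, not values from this article):
+
+```powershell
+# Sketch: update the Start/Stop VMs shared variables with Az PowerShell.
+Connect-AzAccount
+
+$params = @{
+    ResourceGroupName     = "myAutomationRG"      # placeholder
+    AutomationAccountName = "myAutomationAccount" # placeholder
+    Encrypted             = $false
+}
+
+# Limit the feature to specific resource groups (comma-separated; wildcard supported).
+Set-AzAutomationVariable @params -Name "External_Start_ResourceGroupNames" -Value "rg-web,rg-app"
+Set-AzAutomationVariable @params -Name "External_Stop_ResourceGroupNames"  -Value "rg-web,rg-app"
+
+# For Azure CSP subscriptions, stop the solution from looking for Classic VM resources.
+Set-AzAutomationVariable @params -Name "External_EnableClassicVMs" -Value $false
+```
+
+The same variables are also editable in the portal under **Shared Resources** > **Variables** in the Automation account.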
+## Create alerts
+
+Start/Stop VMs during off-hours doesn't include a predefined set of Automation job alerts. Review [Forward job data to Azure Monitor Logs](automation-manage-send-joblogs-log-analytics.md#azure-monitor-log-records) to learn about the runbook job log data forwarded from the Automation account, and how to create failed-job alerts to support your DevOps or operational processes and procedures.
## Next steps

* To set up the feature, see [Configure Stop/Start VMs during off-hours](automation-solution-vm-management-config.md).
-* To resolve feature errors, see [Troubleshoot Start/Stop VMs during off-hours issues](troubleshoot/start-stop-vm.md).
+* To resolve feature errors, see [Troubleshoot Start/Stop VMs during off-hours issues](troubleshoot/start-stop-vm.md).
automation Automation Solution Vm Management https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/automation/automation-solution-vm-management.md
Title: Azure Automation Start/Stop VMs during off-hours overview
description: This article describes the Start/Stop VMs during off-hours feature, which starts or stops VMs on a schedule and proactively monitors them from Azure Monitor Logs. Previously updated : 02/04/2020 Last updated : 05/18/2021
The Start/Stop VMs during off-hours feature starts or stops enabled Azure VMs. It starts or stops machines on user-defined schedules, provides insights through Azure Monitor logs, and sends optional emails by using [action groups](../azure-monitor/alerts/action-groups.md). The feature can be enabled on both Azure Resource Manager and classic VMs for most scenarios.
+> [!NOTE]
+> Before you install this version, we would like you to know about the [next version](https://github.com/microsoft/startstopv2-deployments), which is in preview right now. This new version (V2) offers all the same functionality as this one, but is designed to take advantage of newer technology in Azure. It adds some of the commonly requested features from customers, such as multi-subscription support from a single Start/Stop instance.
This feature uses the [Start-AzVm](/powershell/module/az.compute/start-azvm) cmdlet to start VMs. It uses [Stop-AzVM](/powershell/module/az.compute/stop-azvm) to stop VMs.

> [!NOTE]
The following are limitations with the current feature:
- It manages VMs in any region, but can only be used in the same subscription as your Azure Automation account.
- It is available in Azure and Azure Government for any region that supports a Log Analytics workspace, an Azure Automation account, and alerts. Azure Government regions currently don't support email functionality.
-> [!NOTE]
-> Before you install this version, we would like you to know about the [next version](https://github.com/microsoft/startstopv2-deployments), which is in preview right now. This new version (V2) offers all the same functionality as this one, but is designed to take advantage of newer technology in Azure. It adds some of the commonly requested features from customers, such as multi-subscription support from a single Start/Stop instance.
## Prerequisites

- The runbooks for the Start/Stop VMs during off hours feature work with an [Azure Run As account](./automation-security-overview.md#run-as-accounts). The Run As account is the preferred authentication method because it uses certificate authentication instead of a password that might expire or change frequently.
-- An [Azure Monitor Log Analytics workspace](../azure-monitor/logs/design-logs-deployment.md) that stores the runbook job logs and job stream results in a workspace to query and analyze. The Automation account can be linked to a new or existing Log Analytics workspace, and both resources need to be in the same resource group.
+- An [Azure Monitor Log Analytics workspace](../azure-monitor/logs/design-logs-deployment.md) that stores the runbook job logs and job stream results in a workspace to query and analyze. The Automation account and Log Analytics workspace need to be in the same subscription and supported region. The workspace needs to already exist; you cannot create a new workspace during deployment of this feature.
We recommend that you use a separate Automation account for working with VMs enabled for the Start/Stop VMs during off-hours feature. Azure module versions are frequently upgraded, and their parameters might change. The feature isn't upgraded on the same cadence and it might not work with newer versions of the cmdlets that it uses. Before importing the updated modules into your production Automation account(s), we recommend you import them into a test Automation account to verify there aren't any compatibility issues.
To enable VMs for the Start/Stop VMs during off-hours feature using an existing
### Permissions for new Automation account and new Log Analytics workspace
-You can enable VMs for the Start/Stop VMs during off-hours feature using a new Automation account and Log Analytics workspace. In this case, you need the permissions defined in the preceding section as well as the permissions defined in this section. You also require the following roles:
+You can enable VMs for the Start/Stop VMs during off-hours feature using a new Automation account and Log Analytics workspace. In this case, you need the permissions defined in the previous section and the permissions defined in this section. You also require the following roles:
- Co-Administrator on subscription. This role is required to create the Classic Run As account if you are going to manage classic VMs. [Classic Run As accounts](automation-create-standalone-account.md#create-a-classic-run-as-account) are no longer created by default.
- Membership in the [Azure AD](../active-directory/roles/permissions-reference.md) Application Developer role. For more information on configuring Run As Accounts, see [Permissions to configure Run As accounts](automation-security-overview.md#permissions).
You can enable VMs for the Start/Stop VMs during off-hours feature using a new A
## Components
-The Start/Stop VMs during off-hours feature include preconfigured runbooks, schedules, and integration with Azure Monitor logs. You can use these elements to tailor the startup and shutdown of your VMs to suit your business needs.
+The Start/Stop VMs during off-hours feature includes preconfigured runbooks, schedules, and integration with Azure Monitor Logs. You can use these elements to tailor the startup and shutdown of your VMs to suit your business needs.
### Runbooks
azure-arc Agent Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/servers/agent-overview.md
Title: Overview of the Connected Machine agent description: This article provides a detailed overview of the Azure Arc enabled servers agent available, which supports monitoring virtual machines hosted in hybrid environments. Previously updated : 05/14/2021 Last updated : 05/18/2021
The following versions of the Windows and Linux operating system are officially
> [!WARNING]
> The Linux hostname or Windows computer name cannot use one of the reserved words or trademarks in the name; otherwise, attempting to register the connected machine with Azure will fail. See [Resolve reserved resource name errors](../../azure-resource-manager/templates/error-reserved-resource-name.md) for a list of the reserved words.
+> [!NOTE]
+> While Arc enabled servers supports Amazon Linux, the following do not support this distro:
+> * Agents used by Azure Monitor (that is, the Log Analytics and Dependency agent)
+> * Azure Automation Update Management
+> * VM insights
### Software requirements

* .NET Framework 4.6 or later is required. [Download the .NET Framework](/dotnet/framework/install/guide-for-developers).
azure-arc Manage Agent https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/servers/manage-agent.md
Title: Managing the Azure Arc enabled servers agent description: This article describes the different management tasks that you will typically perform during the lifecycle of the Azure Arc enabled servers Connected Machine agent. Previously updated : 04/27/2021 Last updated : 05/18/2021
To disconnect with your elevated logged-on credentials (interactive), run the fo
## Remove the agent
-Perform one of the following methods to uninstall the Windows or Linux Connected Machine agent from the machine. Removing the agent does not unregister the machine with Arc enabled servers or remove the Azure VM extensions installed. Unregister the machine and remove the installed VM extensions separately when you no longer need to manage the machine in Azure, and those steps should be completed prior to uninstalling the agent.
+Perform one of the following methods to uninstall the Windows or Linux Connected Machine agent from the machine. Removing the agent does not unregister the machine with Arc enabled servers or remove the installed Azure VM extensions. For servers or machines you no longer want to manage with Azure Arc enabled servers, follow these steps to stop managing them:
+
+1. Remove any VM extensions that you don't want to remain on the machine, using the [Azure portal](manage-vm-extensions-portal.md#uninstall-extension), the [Azure CLI](manage-vm-extensions-cli.md#remove-an-installed-extension), or [Azure PowerShell](manage-vm-extensions-powershell.md#remove-an-installed-extension).
+1. Unregister the machine by running `azcmagent disconnect` to delete the Arc enabled servers resource in Azure. If that fails, you can delete the resource manually in Azure. Otherwise, if the resource was deleted in Azure, you'll need to run `azcmagent disconnect --force-local-only` on the server to remove the local configuration.
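The decision in step 2 above can be sketched as a tiny helper that only assembles the `azcmagent` invocation described in the text; `disconnect_command` is hypothetical and shown purely for illustration:

```python
def disconnect_command(resource_deleted_in_azure):
    """Build the azcmagent invocation for unregistering the machine.

    Pass resource_deleted_in_azure=True only when the Arc enabled servers
    resource was already deleted in Azure, so only the local agent
    configuration needs to be removed."""
    cmd = ["azcmagent", "disconnect"]
    if resource_deleted_in_azure:
        cmd.append("--force-local-only")
    return cmd

print(disconnect_command(False))  # ['azcmagent', 'disconnect']
print(disconnect_command(True))   # ['azcmagent', 'disconnect', '--force-local-only']
```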
### Windows agent
azure-cache-for-redis Cache Configure https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-cache-for-redis/cache-configure.md
# How to configure Azure Cache for Redis
-This topic describes the configurations available for your Azure Cache for Redis instances. This topic also covers the default Redis server configuration for Azure Cache for Redis instances.
+
+This article describes the configurations available for your Azure Cache for Redis instances. This article also covers the default Redis server configuration for Azure Cache for Redis instances.
> [!NOTE]
> For more information on configuring and using premium cache features, see [How to configure persistence](cache-how-to-premium-persistence.md), [How to configure clustering](cache-how-to-premium-clustering.md), and [How to configure Virtual Network support](cache-how-to-premium-vnet.md).
This topic describes the configurations available for your Azure Cache for Redis
>

## Configure Azure Cache for Redis settings

[!INCLUDE [redis-cache-create](../../includes/redis-cache-browse.md)]
-Azure Cache for Redis settings are viewed and configured on the **Azure Cache for Redis** blade using the **Resource Menu**.
+Azure Cache for Redis settings are viewed and configured on the **Azure Cache for Redis** page on the left, using the **Resource Menu**.
![Azure Cache for Redis Settings](./media/cache-configure/redis-cache-settings.png)
You can view and configure the following settings using the **Resource Menu**.
* [Tags](#tags)
* [Diagnose and solve problems](#diagnose-and-solve-problems)
* [Settings](#settings)
- * [Access keys](#access-keys)
- * [Advanced settings](#advanced-settings)
- * [Azure Cache for Redis Advisor](#azure-cache-for-redis-advisor)
- * [Scale](#scale)
- * [Cluster size](#cluster-size)
- * [Data persistence](#redis-data-persistence)
- * [Schedule updates](#schedule-updates)
- * [Geo-replication](#geo-replication)
- * [Virtual Network](#virtual-network)
- * [Firewall](#firewall)
- * [Properties](#properties)
- * [Locks](#locks)
- * [Automation script](#automation-script)
+ * [Access keys](#access-keys)
+ * [Advanced settings](#advanced-settings)
+ * [Azure Cache for Redis Advisor](#azure-cache-for-redis-advisor)
+ * [Scale](#scale)
+ * [Cluster size](#cluster-size)
+ * [Data persistence](#redis-data-persistence)
+ * [Schedule updates](#schedule-updates)
+ * [Geo-replication](#geo-replication)
+ * [Virtual Network](#virtual-network)
+ * [Firewall](#firewall)
+ * [Properties](#properties)
+ * [Locks](#locks)
+ * [Automation script](#automation-script)
* Administration
- * [Import data](#importexport)
- * [Export data](#importexport)
- * [Reboot](#reboot)
+ * [Import data](#importexport)
+ * [Export data](#importexport)
+ * [Reboot](#reboot)
* [Monitoring](#monitoring)
- * [Redis metrics](#redis-metrics)
- * [Alert rules](#alert-rules)
- * [Diagnostics](#diagnostics)
+ * [Redis metrics](#redis-metrics)
+ * [Alert rules](#alert-rules)
+ * [Diagnostics](#diagnostics)
* Support & troubleshooting settings
- * [Resource health](#resource-health)
- * [New support request](#new-support-request)
-
+ * [Resource health](#resource-health)
+ * [New support request](#new-support-request)
## Overview
You can view and configure the following settings using the **Resource Menu**.
### Activity log
-Click **Activity log** to view actions performed on your cache. You can also use filtering to expand this view to include other resources. For more information on working with audit logs, see [Audit operations with Resource Manager](../azure-resource-manager/management/view-activity-logs.md). For more information on monitoring Azure Cache for Redis events, see [Operations and alerts](cache-how-to-monitor.md#operations-and-alerts).
+Select **Activity log** to view actions done to your cache. You can also use filtering to expand this view to include other resources. For more information on working with audit logs, see [Audit operations with Resource Manager](../azure-resource-manager/management/view-activity-logs.md). For more information on monitoring Azure Cache for Redis events, see [Operations and alerts](cache-how-to-monitor.md#operations-and-alerts).
### Access control (IAM)
The **Access control (IAM)** section provides support for Azure role-based acces
The **Tags** section helps you organize your resources. For more information, see [Using tags to organize your Azure resources](../azure-resource-manager/management/tag-resources.md).

### Diagnose and solve problems
-Click **Diagnose and solve problems** to be provided with common issues and strategies for resolving them.
--
+Select **Diagnose and solve problems** to be provided with common issues and strategies for resolving them.
## Settings

The **Settings** section allows you to access and configure the following settings for your cache.

* [Access keys](#access-keys)
The **Settings** section allows you to access and configure the following settin
* [Locks](#locks)
* [Automation script](#automation-script)

### Access keys
-Click **Access keys** to view or regenerate the access keys for your cache. These keys are used by the clients connecting to your cache.
+
+Select **Access keys** to view or regenerate the access keys for your cache. These keys are used by the clients connecting to your cache.
![Azure Cache for Redis Access Keys](./media/cache-configure/redis-cache-manage-keys.png)

### Advanced settings
-The following settings are configured on the **Advanced settings** blade.
+
+The following settings are configured on the **Advanced settings** page on the left.
* [Access Ports](#access-ports)
* [Memory policies](#memory-policies)
* [Keyspace notifications (advanced settings)](#keyspace-notifications-advanced-settings)

#### Access Ports
-By default, non-TLS/SSL access is disabled for new caches. To enable the non-TLS port, click **No** for **Allow access only via SSL** on the **Advanced settings** blade and then click **Save**.
+
+By default, non-TLS/SSL access is disabled for new caches. To enable the non-TLS port, select **No** for **Allow access only via SSL** on the **Advanced settings** page on the left, and then select **Save**.
> [!NOTE] > TLS access to Azure Cache for Redis supports TLS 1.0, 1.1 and 1.2 currently, but versions 1.0 and 1.1 are being retired soon. Please read our [Remove TLS 1.0 and 1.1 page](cache-remove-tls-10-11.md) for more details.
By default, non-TLS/SSL access is disabled for new caches. To enable the non-TLS
![Azure Cache for Redis Access Ports](./media/cache-configure/redis-cache-access-ports.png)

<a name="maxmemory-policy-and-maxmemory-reserved"></a>

#### Memory policies
-The **Maxmemory policy**, **maxmemory-reserved**, and **maxfragmentationmemory-reserved** settings on the **Advanced settings** blade configure the memory policies for the cache.
+
+The **Maxmemory policy**, **maxmemory-reserved**, and **maxfragmentationmemory-reserved** settings on the **Advanced settings** page on the left configure the memory policies for the cache.
![Azure Cache for Redis Maxmemory Policy](./media/cache-configure/redis-cache-maxmemory-policy.png)

**Maxmemory policy** configures the eviction policy for the cache and allows you to choose from the following eviction policies:
-* `volatile-lru` - This is the default eviction policy.
+* `volatile-lru` - The default eviction policy.
* `allkeys-lru`
* `volatile-random`
* `allkeys-random`
The **Maxmemory policy**, **maxmemory-reserved**, and **maxfragmentationmemory-r
For more information about `maxmemory` policies, see [Eviction policies](https://redis.io/topics/lru-cache#eviction-policies).
-The **maxmemory-reserved** setting configures the amount of memory, in MB per instance in a cluster, that is reserved for non-cache operations, such as replication during failover. Setting this value allows you to have a more consistent Redis server experience when your load varies. This value should be set higher for workloads that are write heavy. When memory is reserved for such operations, it is unavailable for storage of cached data.
+The **maxmemory-reserved** setting configures the amount of memory, in MB per instance in a cluster, that is reserved for non-cache operations, such as replication during failover. Setting this value allows you to have a more consistent Redis server experience when your load varies. This value should be set higher for workloads that write large amounts of data. When memory is reserved for such operations, it's unavailable for storage of cached data.
-The **maxfragmentationmemory-reserved** setting configures the amount of memory, in MB per instance in a cluster, that is reserved to accommodate for memory fragmentation. Setting this value allows you to have a more consistent Redis server experience when the cache is full or close to full and the fragmentation ratio is high. When memory is reserved for such operations, it is unavailable for storage of cached data.
+The **maxfragmentationmemory-reserved** setting configures the amount of memory, in MB per instance in a cluster, that is reserved to accommodate memory fragmentation. When you set this value, you have a more consistent Redis server experience when the cache is full or close to full and the fragmentation ratio is high. When memory is reserved for such operations, it's unavailable for storage of cached data.
-One thing to consider when choosing a new memory reservation value (**maxmemory-reserved** or **maxfragmentationmemory-reserved**) is how this change might affect a cache that is already running with large amounts of data in it. For instance, if you have a 53 GB cache with 49 GB of data, then change the reservation value to 8 GB, this change will drop the max available memory for the system down to 45 GB. If either your current `used_memory` or your `used_memory_rss` values are higher than the new limit of 45 GB, then the system will have to evict data until both `used_memory` and `used_memory_rss` are below 45 GB. Eviction can increase server load and memory fragmentation. For more information on cache metrics such as `used_memory` and `used_memory_rss`, see [Available metrics and reporting intervals](cache-how-to-monitor.md#available-metrics-and-reporting-intervals).
+One thing to consider when choosing a new memory reservation value (**maxmemory-reserved** or **maxfragmentationmemory-reserved**) is how this change might affect a cache that is already running with large amounts of data in it. For instance, if you have a 53-GB cache with 49 GB of data, then change the reservation value to 8 GB, this change will drop the max available memory for the system down to 45 GB. If either your current `used_memory` or your `used_memory_rss` values are higher than the new limit of 45 GB, then the system will have to evict data until both `used_memory` and `used_memory_rss` are below 45 GB. Eviction can increase server load and memory fragmentation. For more information on cache metrics such as `used_memory` and `used_memory_rss`, see [Available metrics and reporting intervals](cache-how-to-monitor.md#available-metrics-and-reporting-intervals).
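The arithmetic in the worked example above can be sketched in a few lines; `available_memory_gb` and `must_evict` are hypothetical helpers, not part of any Azure SDK:

```python
def available_memory_gb(cache_size_gb, reserved_gb):
    """Max memory left for cached data after setting maxmemory-reserved
    and/or maxfragmentationmemory-reserved."""
    return cache_size_gb - reserved_gb

def must_evict(used_memory_gb, cache_size_gb, reserved_gb):
    """True when used_memory (or used_memory_rss) exceeds the new limit,
    forcing the system to evict data until it fits."""
    return used_memory_gb > available_memory_gb(cache_size_gb, reserved_gb)

# The example from the text: a 53-GB cache with 49 GB of data
# and a reservation raised to 8 GB leaves 45 GB for cached data.
print(available_memory_gb(53, 8))  # 45
print(must_evict(49, 53, 8))       # True: 49 GB > 45 GB, so eviction occurs
```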
> [!IMPORTANT]
> The **maxmemory-reserved** and **maxfragmentationmemory-reserved** settings are available only for Standard and Premium caches.
->
+>
> The `noeviction` eviction policy is the only memory policy that's available for an Enterprise tier cache.
>

#### Keyspace notifications (advanced settings)
-Redis keyspace notifications are configured on the **Advanced settings** blade. Keyspace notifications allow clients to receive notifications when certain events occur.
+
+Redis keyspace notifications are configured on the **Advanced settings** page on the left. Keyspace notifications allow clients to receive notifications when certain events occur.
![Azure Cache for Redis Advanced Settings](./media/cache-configure/redis-cache-advanced-settings.png)
Redis keyspace notifications are configured on the **Advanced settings** blade.
For more information, see [Redis Keyspace Notifications](https://redis.io/topics/notifications). For sample code, see the [KeySpaceNotifications.cs](https://github.com/rustd/RedisSamples/blob/master/HelloWorld/KeySpaceNotifications.cs) file in the [Hello world](https://github.com/rustd/RedisSamples/tree/master/HelloWorld) sample.

<a name="recommendations"></a>

## Azure Cache for Redis Advisor
-The **Azure Cache for Redis Advisor** blade displays recommendations for your cache. During normal operations, no recommendations are displayed.
+
+The **Azure Cache for Redis Advisor** page on the left displays recommendations for your cache. During normal operations, no recommendations are displayed.
![Screenshot that shows where the recommendations are displayed.](./media/cache-configure/redis-cache-no-recommendations.png)
-If any conditions occur during the operations of your cache such as high memory usage, network bandwidth, or server load, an alert is displayed on the **Azure Cache for Redis** blade.
+If any conditions occur during the operations of your cache, such as high memory usage, network bandwidth, or server load, an alert is displayed on the **Azure Cache for Redis** page on the left.
![Screenshot that shows where alerts are displayed in the Azure Cache for Redis section.](./media/cache-configure/redis-cache-recommendations-alert.png)
-Further information can be found on the **Recommendations** blade.
+Further information can be found on the **Recommendations** page on the left.
![Recommendations](./media/cache-configure/redis-cache-recommendations.png)
-You can monitor these metrics on the [Monitoring charts](cache-how-to-monitor.md#monitoring-charts) and [Usage charts](cache-how-to-monitor.md#usage-charts) sections of the **Azure Cache for Redis** blade.
+You can monitor these metrics on the [Monitoring charts](cache-how-to-monitor.md#monitoring-charts) and [Usage charts](cache-how-to-monitor.md#usage-charts) sections of the **Azure Cache for Redis** page on the left.
Each pricing tier has different limits for client connections, memory, and bandwidth. If your cache approaches maximum capacity for these metrics over a sustained period of time, a recommendation is created. For more information about the metrics and limits reviewed by the **Recommendations** tool, see the following table:

| Azure Cache for Redis metric | More information |
| --- | --- |
| Network bandwidth usage |[Cache performance - available bandwidth](cache-planning-faq.md#azure-cache-for-redis-performance) |
-| Connected clients |[Default Redis server configuration - maxclients](#maxclients) |
+| Connected clients |[Default Redis server configuration - max clients](#maxclients) |
| Server load |[Usage charts - Redis Server Load](cache-how-to-monitor.md#usage-charts) |
| Memory usage |[Cache performance - size](cache-planning-faq.md#azure-cache-for-redis-performance) |
-To upgrade your cache, click **Upgrade now** to change the pricing tier and [scale](#scale) your cache. For more information on choosing a pricing tier, see [Choosing the right tier](cache-overview.md#choosing-the-right-tier)
-
+To upgrade your cache, select **Upgrade now** to change the pricing tier and [scale](#scale) your cache. For more information on choosing a pricing tier, see [Choosing the right tier](cache-overview.md#choosing-the-right-tier).
### Scale
-Click **Scale** to view or change the pricing tier for your cache. For more information on scaling, see [How to Scale Azure Cache for Redis](cache-how-to-scale.md).
+
+Select **Scale** to view or change the pricing tier for your cache. For more information on scaling, see [How to Scale Azure Cache for Redis](cache-how-to-scale.md).
![Azure Cache for Redis pricing tier](./media/cache-configure/pricing-tier.png)

<a name="cluster-size"></a>

### Redis Cluster Size
-Click **Cluster Size** to change the cluster size for a running premium cache with clustering enabled.
+
+Select **Cluster Size** to change the cluster size for a running premium cache with clustering enabled.
![Cluster size](./media/cache-configure/redis-cache-redis-cluster-size.png)
-To change the cluster size, use the slider or type a number between 1 and 10 in the **Shard count** text box and click **OK** to save.
+To change the cluster size, use the slider or type a number between 1 and 10 in the **Shard count** text box. Then, select **OK** to save.
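The 1-10 range for **Shard count** can be enforced with a trivial check; `validate_shard_count` is a hypothetical helper shown only to illustrate the limit:

```python
def validate_shard_count(shards):
    """A premium cache cluster accepts between 1 and 10 shards."""
    if not 1 <= shards <= 10:
        raise ValueError(f"Shard count must be between 1 and 10, got {shards}")
    return shards

print(validate_shard_count(3))  # 3
```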
> [!IMPORTANT]
> Redis clustering is only available for Premium caches. For more information, see [How to configure clustering for a Premium Azure Cache for Redis](cache-how-to-premium-clustering.md).
>

### Redis data persistence
-Click **Data persistence** to enable, disable, or configure data persistence for your premium cache. Azure Cache for Redis offers Redis persistence using either RDB persistence or AOF persistence.
-For more information, see [How to configure persistence for a Premium Azure Cache for Redis](cache-how-to-premium-persistence.md).
+Select **Data persistence** to enable, disable, or configure data persistence for your premium cache. Azure Cache for Redis offers Redis persistence using either RDB persistence or AOF persistence.
+For more information, see [How to configure persistence for a Premium Azure Cache for Redis](cache-how-to-premium-persistence.md).
> [!IMPORTANT]
> Redis data persistence is only available for Premium caches.
For more information, see [How to configure persistence for a Premium Azure Cach
>

### Schedule updates
-The **Schedule updates** blade allows you to designate a maintenance window for Redis server updates for your cache.
+
+The **Schedule updates** page on the left allows you to choose a maintenance window for Redis server updates for your cache.
> [!IMPORTANT]
> The maintenance window applies only to Redis server updates, and not to any Azure updates or updates to the operating system of the VMs that host the cache.
The **Schedule updates** blade allows you to designate a maintenance window for
![Schedule updates](./media/cache-configure/redis-schedule-updates.png)
-To specify a maintenance window, check the desired days and specify the maintenance window start hour for each day, and click **OK**. The maintenance window time is in UTC.
+To specify a maintenance window, check the days you want. Then, specify the maintenance window start hour for each day, and select **OK**. The maintenance window time is in UTC.
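To illustrate how a per-day start hour in UTC defines the window, here is a minimal sketch. `in_maintenance_window` is a hypothetical helper, and the 5-hour default duration is an assumption for this sketch, not a documented service value:

```python
from datetime import datetime, timezone

def in_maintenance_window(now_utc, start_hours_by_weekday, duration_hours=5):
    """Return True when now_utc falls inside a scheduled window.

    start_hours_by_weekday maps Monday=0 .. Sunday=6 to a UTC start hour.
    Windows that cross midnight are not handled in this simple sketch."""
    start = start_hours_by_weekday.get(now_utc.weekday())
    return start is not None and start <= now_utc.hour < start + duration_hours

# A window configured for Tuesdays starting at 02:00 UTC:
windows = {1: 2}
print(in_maintenance_window(datetime(2021, 5, 18, 3, tzinfo=timezone.utc), windows))  # True
print(in_maintenance_window(datetime(2021, 5, 18, 8, tzinfo=timezone.utc), windows))  # False
```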
For more information and instructions, see [Azure Cache for Redis administration - Schedule updates](cache-administration.md#schedule-updates).

### Geo-replication
-The **Geo-replication** blade provides a mechanism for linking two Premium tier Azure Cache for Redis instances. One cache is designated as the primary linked cache, and the other as the secondary linked cache. The secondary linked cache becomes read-only, and data written to the primary cache is replicated to the secondary linked cache. This functionality can be used to replicate a cache across Azure regions.
+**Geo-replication**, on the left, provides a mechanism for linking two Premium tier Azure Cache for Redis instances. One cache is named the primary linked cache, and the other the secondary linked cache. The secondary linked cache becomes read-only, and data written to the primary cache is replicated to the secondary linked cache. This functionality can be used to replicate a cache across Azure regions.
> [!IMPORTANT]
> **Geo-replication** is only available for Premium tier caches. For more information and instructions, see [How to configure Geo-replication for Azure Cache for Redis](cache-how-to-geo-replication.md).
The **Geo-replication** blade provides a mechanism for linking two Premium tier
>

### Virtual Network

The **Virtual Network** section allows you to configure the virtual network settings for your cache. For information on creating a premium cache with VNET support and updating its settings, see [How to configure Virtual Network Support for a Premium Azure Cache for Redis](cache-how-to-premium-vnet.md).

> [!IMPORTANT]
The **Virtual Network** section allows you to configure the virtual network sett
Firewall rules configuration is available for all Azure Cache for Redis tiers.
-Click **Firewall** to view and configure firewall rules for cache.
+Select **Firewall** to view and configure firewall rules for cache.
![Firewall](./media/cache-configure/redis-firewall-rules.png)
You can specify firewall rules with a start and end IP address range. When firew
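The start-to-end IP range check described above — a client is allowed when its address falls between a rule's start and end addresses, inclusive — can be sketched with Python's standard `ipaddress` module; `rule_allows` is a hypothetical helper, not an Azure API:

```python
import ipaddress

def rule_allows(client_ip, start_ip, end_ip):
    """True when client_ip falls inside the rule's inclusive start-end range."""
    ip = ipaddress.ip_address(client_ip)
    return ipaddress.ip_address(start_ip) <= ip <= ipaddress.ip_address(end_ip)

print(rule_allows("40.64.1.20", "40.64.1.0", "40.64.1.255"))  # True
print(rule_allows("10.0.0.5", "40.64.1.0", "40.64.1.255"))    # False
```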
>

### Properties
-Click **Properties** to view information about your cache, including the cache endpoint and ports.
+
+Select **Properties** to view information about your cache, including the cache endpoint and ports.
![Azure Cache for Redis Properties](./media/cache-configure/redis-cache-properties.png)

### Locks

The **Locks** section allows you to lock a subscription, resource group, or resource to prevent other users in your organization from accidentally deleting or modifying critical resources. For more information, see [Lock resources with Azure Resource Manager](../azure-resource-manager/management/lock-resources.md).

### Automation script
-Click **Automation script** to build and export a template of your deployed resources for future deployments. For more information about working with templates, see [Deploy resources with Azure Resource Manager templates](../azure-resource-manager/templates/deploy-powershell.md).
+Select **Automation script** to build and export a template of your deployed resources for future deployments. For more information about working with templates, see [Deploy resources with Azure Resource Manager templates](../azure-resource-manager/templates/deploy-powershell.md).
## Administration settings

The settings in the **Administration** section allow you to perform the following administrative tasks for your cache.

![Administration](./media/cache-configure/redis-cache-administration.png)
The settings in the **Administration** section allow you to perform the followin
* [Export data](#importexport)
* [Reboot](#reboot)

### Import/Export

Import/Export is an Azure Cache for Redis data management operation, which allows you to import and export data in the cache by importing and exporting an Azure Cache for Redis Database (RDB) snapshot from a premium cache to a page blob in an Azure Storage Account. Import/Export enables you to migrate between different Azure Cache for Redis instances or populate the cache with data before use.

Import can be used to bring Redis compatible RDB files from any Redis server running in any cloud or environment, including Redis running on Linux, Windows, or any cloud provider such as Amazon Web Services and others. Importing data is an easy way to create a cache with pre-populated data. During the import process, Azure Cache for Redis loads the RDB files from Azure storage into memory, and then inserts the keys into the cache.
Export allows you to export the data stored in Azure Cache for Redis to Redis co
> ### Reboot
-The **Reboot** blade allows you to reboot the nodes of your cache. This reboot capability enables you to test your application for resiliency if there is a failure of a cache node.
+
+Selecting **Reboot** on the left allows you to reboot the nodes of your cache. This reboot capability enables you to test your application for resiliency if there is a failure of a cache node.
![Reboot](./media/cache-configure/redis-cache-reboot.png)
If you have a premium cache with clustering enabled, you can select which shards
![Screenshot that shows where to select which shards of the cache to reboot.](./media/cache-configure/redis-cache-reboot-cluster.png)
-To reboot one or more nodes of your cache, select the desired nodes and click **Reboot**. If you have a premium cache with clustering enabled, select the shard(s) to reboot and then click **Reboot**. After a few minutes, the selected node(s) reboot, and are back online a few minutes later.
+To reboot one or more nodes of your cache, select the desired nodes and select **Reboot**. If you have a premium cache with clustering enabled, select the shard(s) to reboot and then select **Reboot**. After a few minutes, the selected node(s) reboot, and are back online a few minutes later.
> [!IMPORTANT] > Reboot is now available for all pricing tiers. For more information and instructions, see [Azure Cache for Redis administration - Reboot](cache-administration.md#reboot). > > - ## Monitoring The **Monitoring** section allows you to configure diagnostics and monitoring for your Azure Cache for Redis.
For more information on Azure Cache for Redis monitoring and diagnostics, see [H
* [Diagnostics](#diagnostics) ### Redis metrics
-Click **Redis metrics** to [view metrics](cache-how-to-monitor.md#view-cache-metrics) for your cache.
+
+Select **Redis metrics** to [view metrics](cache-how-to-monitor.md#view-cache-metrics) for your cache.
### Alert rules
-Click **Alert rules** to configure alerts based on Azure Cache for Redis metrics. For more information, see [Alerts](cache-how-to-monitor.md#alerts).
+Select **Alert rules** to configure alerts based on Azure Cache for Redis metrics. For more information, see [Alerts](cache-how-to-monitor.md#alerts).
### Diagnostics
-By default, cache metrics in Azure Monitor are [stored for 30 days](../azure-monitor/essentials/data-platform-metrics.md) and then deleted. To persist your cache metrics for longer than 30 days, click **Diagnostics** to [configure the storage account](cache-how-to-monitor.md#export-cache-metrics) used to store cache diagnostics.
+By default, cache metrics in Azure Monitor are [stored for 30 days](../azure-monitor/essentials/data-platform-metrics.md) and then deleted. To persist your cache metrics for longer than 30 days, select **Diagnostics** to [configure the storage account](cache-how-to-monitor.md#export-cache-metrics) used to store cache diagnostics.
>[!NOTE] >In addition to archiving your cache metrics to storage, you can also [stream them to an Event hub or send them to Azure Monitor logs](../azure-monitor/essentials/stream-monitoring-data-event-hubs.md).
By default, cache metrics in Azure Monitor are [stored for 30 days](../azure-mon
> ## Support & troubleshooting settings+ The settings in the **Support + troubleshooting** section provide you with options for resolving issues with your cache. ![Support + troubleshooting](./media/cache-configure/redis-cache-support-troubleshooting.png)
The settings in the **Support + troubleshooting** section provide you with optio
* [New support request](#new-support-request) ### Resource health+ **Resource health** watches your resource and tells you if it's running as expected. For more information about the Azure Resource health service, see [Azure Resource health overview](../service-health/resource-health-overview.md). > [!NOTE]
The settings in the **Support + troubleshooting** section provide you with optio
> ### New support request
-Click **New support request** to open a support request for your cache.
---
+Select **New support request** to open a support request for your cache.
## Default Redis server configuration+ New Azure Cache for Redis instances are configured with the following default Redis configuration values: > [!NOTE]
New Azure Cache for Redis instances are configured with the following default Re
| `maxmemory-samples` |3 |To save memory, LRU and minimal TTL algorithms are approximated algorithms instead of precise algorithms. By default Redis checks three keys and picks the one that was used less recently. | | `lua-time-limit` |5,000 |Max execution time of a Lua script in milliseconds. If the maximum execution time is reached, Redis logs that a script is still in execution after the maximum allowed time, and starts to reply to queries with an error. | | `lua-event-limit` |500 |Max size of script event queue. |
-| `client-output-buffer-limit` `normalclient-output-buffer-limit` `pubsub` |0 0 032mb 8mb 60 |The client output buffer limits can be used to force disconnection of clients that are not reading data from the server fast enough for some reason (a common reason is that a Pub/Sub client can't consume messages as fast as the publisher can produce them). For more information, see [https://redis.io/topics/clients](https://redis.io/topics/clients). |
+| `client-output-buffer-limit normal` `client-output-buffer-limit pubsub` |0 0 0 32mb 8mb 60 |The client output buffer limits can be used to force disconnection of clients that are not reading data from the server fast enough for some reason. A common reason is that a Pub/Sub client can't consume messages as fast as the publisher can produce them. For more information, see [https://redis.io/topics/clients](https://redis.io/topics/clients). |
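Each limit is a triplet of `<hard limit> <soft limit> <soft seconds>`: `0 0 0` disables the limit for normal clients, while `32mb 8mb 60` disconnects a pub/sub client whose output buffer either exceeds 32 MB outright or stays above 8 MB for 60 seconds. As an illustrative sketch (not Redis source code), the decision logic looks roughly like this in Rust:

```rust
const MB: u64 = 1024 * 1024;

// Sketch of the client-output-buffer-limit policy (illustrative, not Redis source code).
// Each limit is "<hard> <soft> <soft-seconds>"; a value of 0 means "no limit".
fn should_disconnect(buf_bytes: u64, hard: u64, soft: u64, secs_over_soft: u64, soft_secs: u64) -> bool {
    let hard_hit = hard != 0 && buf_bytes > hard;
    let soft_hit = soft != 0 && buf_bytes > soft && secs_over_soft >= soft_secs;
    hard_hit || soft_hit
}

fn main() {
    // Normal clients (0 0 0): never disconnected by this policy.
    assert!(!should_disconnect(500 * MB, 0, 0, 0, 0));
    // Pub/Sub clients (32mb 8mb 60): hard limit exceeded outright.
    assert!(should_disconnect(40 * MB, 32 * MB, 8 * MB, 0, 60));
    // Soft limit held for the full 60 seconds.
    assert!(should_disconnect(10 * MB, 32 * MB, 8 * MB, 60, 60));
    // Soft limit exceeded, but not for long enough.
    assert!(!should_disconnect(10 * MB, 32 * MB, 8 * MB, 5, 60));
    println!("ok");
}
```

In the real server, the time over the soft limit resets whenever the buffer drops back under the soft limit.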
<a name="databases"></a>+ <sup>1</sup>The limit for `databases` is different for each Azure Cache for Redis pricing tier and can be set at cache creation. If no `databases` setting is specified during cache creation, the default is 16. * Basic and Standard caches
For more information about databases, see [What are Redis databases?](cache-deve
> <a name="maxclients"></a>+ <sup>2</sup>`maxclients` is different for each Azure Cache for Redis pricing tier. * Basic and Standard caches
For more information about databases, see [What are Redis databases?](cache-deve
> > -- ## Redis commands not supported in Azure Cache for Redis+ > [!IMPORTANT] > Because configuration and management of Azure Cache for Redis instances is managed by Microsoft, the following commands are disabled. If you try to invoke them, you receive an error message similar to `"(error) ERR unknown command"`. >
For more information about databases, see [What are Redis databases?](cache-deve
For more information about Redis commands, see [https://redis.io/commands](https://redis.io/commands). ## Redis console+ You can securely issue commands to your Azure Cache for Redis instances using the **Redis Console**, which is available in the Azure portal for all cache tiers. > [!IMPORTANT]
-> - The Redis Console does not work with [VNET](cache-how-to-premium-vnet.md). When your cache is part of a VNET, only clients in the VNET can access the cache. Because Redis Console runs in your local browser, which is outside the VNET, it can't connect to your cache.
-> - Not all Redis commands are supported in Azure Cache for Redis. For a list of Redis commands that are disabled for Azure Cache for Redis, see the previous [Redis commands not supported in Azure Cache for Redis](#redis-commands-not-supported-in-azure-cache-for-redis) section. For more information about Redis commands, see [https://redis.io/commands](https://redis.io/commands).
+>
+> * The Redis Console does not work with [VNET](cache-how-to-premium-vnet.md). When your cache is part of a VNET, only clients in the VNET can access the cache. Because Redis Console runs in your local browser, which is outside the VNET, it can't connect to your cache.
+> * Not all Redis commands are supported in Azure Cache for Redis. For a list of Redis commands that are disabled for Azure Cache for Redis, see the previous [Redis commands not supported in Azure Cache for Redis](#redis-commands-not-supported-in-azure-cache-for-redis) section. For more information about Redis commands, see [https://redis.io/commands](https://redis.io/commands).
> >
-To access the Redis Console, click **Console** from the **Azure Cache for Redis** blade.
+To access the Redis Console, select **Console** from the **Azure Cache for Redis** menu on the left.
![Screenshot that highlights the Console button.](./media/cache-configure/redis-console-menu.png)
-To issue commands against your cache instance, type the desired command into the console.
+To issue commands against your cache instance, type the command you want into the console.
![Screenshot that shows the Redis Console with the input command and results.](./media/cache-configure/redis-console.png) - ### Using the Redis Console with a premium clustered cache
-When using the Redis Console with a premium clustered cache, you can issue commands to a single shard of the cache. To issue a command to a specific shard, first connect to the desired shard by clicking it on the shard picker.
+When using the Redis Console with a premium clustered cache, you can issue commands to a single shard of the cache. To issue a command to a specific shard, first connect to the shard you want by selecting it on the shard picker.
![Redis console](./media/cache-configure/redis-console-premium-cluster.png)
shard1>get myKey
In the previous example, shard 1 is the selected shard, but `myKey` is located in shard 0, as indicated by the `(shard 0)` portion of the error message. In this example, to access `myKey`, select shard 0 using the shard picker, and then issue the desired command. - ## Move your cache to a new subscription
-You can move your cache to a new subscription by clicking **Move**.
+
+You can move your cache to a new subscription by selecting **Move**.
![Move Azure Cache for Redis](./media/cache-configure/redis-cache-move.png) For information on moving resources from one resource group to another, and from one subscription to another, see [Move resources to new resource group or subscription](../azure-resource-manager/management/move-resource-group-and-subscription.md). ## Next steps+ * For more information on working with Redis commands, see [How can I run Redis commands?](cache-development-faq.md#how-can-i-run-redis-commands)
azure-cache-for-redis Cache High Availability https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-cache-for-redis/cache-high-availability.md
Azure Cache for Redis implements high availability by using multiple VMs, called
| Option | Description | Availability | Standard | Premium | Enterprise | | - | - | - | :: | :: | :: | | [Standard replication](#standard-replication)| Dual-node replicated configuration in a single datacenter with automatic failover | 99.9% (see [details](https://azure.microsoft.com/support/legal/sla/cache/v1_0/)) |✔|✔|-|
-| [Zone redundancy](#zone-redundancy) | Multi-node replicated configuration across AZs, with automatic failover | Up to 99.99% (see [details](https://azure.microsoft.com/support/legal/sla/cache/v1_0/)) |-|Preview|✔|
+| [Zone redundancy](#zone-redundancy) | Multi-node replicated configuration across AZs, with automatic failover | Up to 99.99% (see [details](https://azure.microsoft.com/support/legal/sla/cache/v1_0/)) |-|✔|✔|
| [Geo-replication](#geo-replication) | Linked cache instances in two regions, with user-controlled failover | Up to 99.999% (see [details](https://azure.microsoft.com/support/legal/sla/cache/v1_0/)) |-|✔|Preview| ## Standard replication
If the primary node in a Redis cache is unavailable, the replica will promote it
A primary node can go out of service as part of a planned maintenance activity such as Redis software or operating system update. It also can stop working because of unplanned events such as failures in underlying hardware, software, or network. [Failover and patching for Azure Cache for Redis](cache-failover.md) provides a detailed explanation on types of Redis failovers. An Azure Cache for Redis will go through many failovers during its lifetime. The high availability architecture is designed to make these changes inside a cache as transparent to its clients as possible.
->[!NOTE]
->The following is available as a preview.
->
->
- In addition, Azure Cache for Redis allows additional replica nodes in the Premium tier. A [multi-replica cache](cache-how-to-multi-replicas.md) can be configured with up to three replica nodes. Having more replicas generally improves resiliency because of the additional nodes backing up the primary. Even with more replicas, an Azure Cache for Redis instance still can be severely impacted by a datacenter- or AZ-level outage. You can increase cache availability by using multiple replicas in conjunction with [zone redundancy](#zone-redundancy). ## Zone redundancy
Azure Cache for Redis supports zone redundant configurations in the Premium and
### Premium tier
->[!NOTE]
->This is available as a preview.
->
->
- The following diagram illustrates the zone redundant configuration for the Premium tier: :::image type="content" source="media/cache-high-availability/zone-redundancy.png" alt-text="Zone redundancy setup":::
azure-cache-for-redis Cache How To Multi Replicas https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-cache-for-redis/cache-how-to-multi-replicas.md
Title: Add replicas to Azure Cache for Redis (Preview)
+ Title: Add replicas to Azure Cache for Redis
description: Learn how to add more replicas to your Premium tier Azure Cache for Redis instances
Last updated 08/11/2020
-# Add replicas to Azure Cache for Redis (Preview)
+# Add replicas to Azure Cache for Redis
In this article, you'll learn how to set up an Azure Cache instance with additional replicas using the Azure portal. Azure Cache for Redis Standard and Premium tiers offer redundancy by hosting each cache on two dedicated virtual machines (VMs). These VMs are configured as primary and replica. When the primary VM becomes unavailable, the replica detects that and takes over as the new primary automatically. You can now increase the number of replicas in a Premium cache up to three, giving you a total of four VMs backing a cache. Having multiple replicas results in higher resilience than what a single replica can provide.
-> [!IMPORTANT]
-> This preview is provided without a service level agreement, and it's not recommended for production workloads. For more information, see [Supplemental Terms of Use for Microsoft Azure Previews.](https://azure.microsoft.com/support/legal/preview-supplemental-terms/)
->
- ## Prerequisites * Azure subscription - [create one for free](https://azure.microsoft.com/free/)
-> [!NOTE]
-> This feature is currently in preview - [contact us](mailto:azurecache@microsoft.com) if you're interested.
->
- ## Create a cache To create a cache, follow these steps:
azure-cache-for-redis Cache How To Zone Redundancy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-cache-for-redis/cache-how-to-zone-redundancy.md
Title: Enable zone redundancy for Azure Cache for Redis (Preview)
+ Title: Enable zone redundancy for Azure Cache for Redis
description: Learn how to set up zone redundancy for your Premium and Enterprise tier Azure Cache for Redis instances
Last updated 08/11/2020
-# Enable zone redundancy for Azure Cache for Redis (Preview)
+# Enable zone redundancy for Azure Cache for Redis
In this article, you'll learn how to configure a zone-redundant Azure Cache instance using the Azure portal. Azure Cache for Redis Standard, Premium, and Enterprise tiers provide built-in redundancy by hosting each cache on two dedicated virtual machines (VMs). Even though these VMs are located in separate [Azure fault and update domains](../virtual-machines/availability.md) and highly available, they're susceptible to datacenter level failures. Azure Cache for Redis also supports zone redundancy in its Premium and Enterprise tiers. A zone-redundant cache runs on VMs spread across multiple [availability zones](../availability-zones/az-overview.md). It provides higher resilience and availability.
Azure Cache for Redis Standard, Premium, and Enterprise tiers provide built-in r
## Prerequisites * Azure subscription - [create one for free](https://azure.microsoft.com/free/)
-> [!NOTE]
-> This feature is currently in preview - [contact us](mailto:azurecache@microsoft.com) if you're interested.
->
- ## Create a cache To create a cache, follow these steps:
azure-cache-for-redis Cache Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-cache-for-redis/cache-overview.md
The [Azure Cache for Redis Pricing](https://azure.microsoft.com/pricing/details/
| [Scaling](cache-how-to-scale.md) |✔|✔|✔|✔|✔| | [OSS cluster](cache-how-to-premium-clustering.md) |-|-|✔|✔|✔| | [Data persistence](cache-how-to-premium-persistence.md) |-|-|✔|Preview|Preview|
-| [Zone redundancy](cache-how-to-zone-redundancy.md) |-|-|Preview|✔|✔|
+| [Zone redundancy](cache-how-to-zone-redundancy.md) |-|-|✔|✔|✔|
| [Geo-replication](cache-how-to-geo-replication.md) |-|-|✔|Preview|Preview| | [Modules](https://redis.io/modules) |-|-|-|✔|✔| | [Import/Export](cache-how-to-import-export-data.md) |-|-|✔|✔|✔|
azure-cache-for-redis Cache Rust Get Started https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-cache-for-redis/cache-rust-get-started.md
Last updated 01/08/2021
# Quickstart: Use Azure Cache for Redis with Rust
-In this article, you will learn how to use the [Rust programming language](https://www.rust-lang.org/) for interacting with [Azure Cache for Redis](./cache-overview.md). It will demonstrate examples of commonly used Redis data structures such as [String](https://redis.io/topics/data-types-intro#redis-strings), [Hash](https://redis.io/topics/data-types-intro#redis-hashes), [List](https://redis.io/topics/data-types-intro#redis-lists) etc. using the [redis-rs](https://github.com/mitsuhiko/redis-rs) library for Redis. This client exposes both high and low-level APIs and you will see both these styles in action with the help of sample code presented in this article.
+In this article, you'll learn how to use the [Rust programming language](https://www.rust-lang.org/) to interact with [Azure Cache for Redis](./cache-overview.md). You'll also learn about commonly used Redis data structures:
+
+* [String](https://redis.io/topics/data-types-intro#redis-strings)
+* [Hash](https://redis.io/topics/data-types-intro#redis-hashes)
+* [List](https://redis.io/topics/data-types-intro#redis-lists)
+
+You'll use the [redis-rs](https://github.com/mitsuhiko/redis-rs) library for Redis in this sample. This client exposes both high-level and low-level APIs, and you'll see both these styles in action.
## Skip to the code on GitHub
If you're interested in learning how the code works, you can review the followin
The `connect` function is used to establish a connection to Azure Cache for Redis. It expects host name and the password (Access Key) to be passed in via environment variables `REDIS_HOSTNAME` and `REDIS_PASSWORD` respectively. The format for the connection URL is `rediss://<username>:<password>@<hostname>` - Azure Cache for Redis only accepts secure connections with [TLS 1.2 as the minimum required version](cache-remove-tls-10-11.md).
-The call to [redis::Client::open](https://docs.rs/redis/0.19.0/redis/struct.Client.html#method.open) performs basic validation while [get_connection()](https://docs.rs/redis/0.19.0/redis/struct.Client.html#method.get_connection) actually initiates the connection - the program stops if the connectivity fails due to any reason such as an incorrect password.
+The call to [redis::Client::open](https://docs.rs/redis/0.19.0/redis/struct.Client.html#method.open) does basic validation while [get_connection()](https://docs.rs/redis/0.19.0/redis/struct.Client.html#method.get_connection) actually starts the connection. The program stops if the connectivity fails for any reason. For example, one reason might be an incorrect password.
```rust fn connect() -> redis::Connection {
fn connect() -> redis::Connection {
} ```
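As a small illustration of the URL format described above, the helper below (a hypothetical sketch, not part of the sample's source) assembles the `rediss://` connection string from a host name and access key. Azure Cache for Redis doesn't use the username portion, so it's left empty:

```rust
use std::env;

// Assemble the TLS connection URL in the form rediss://<username>:<password>@<hostname>.
// Hypothetical helper; the hostname fallback below is an illustrative placeholder.
fn connection_url(hostname: &str, password: &str) -> String {
    format!("rediss://:{}@{}", password, hostname)
}

fn main() {
    // In the sample, these come from the REDIS_HOSTNAME and REDIS_PASSWORD environment variables.
    let hostname = env::var("REDIS_HOSTNAME").unwrap_or_else(|_| "contoso.redis.cache.windows.net".to_string());
    let password = env::var("REDIS_PASSWORD").unwrap_or_else(|_| "<access-key>".to_string());
    println!("{}", connection_url(&hostname, &password));
}
```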
-The `basics` function covers [SET](https://redis.io/commands/set), [GET](https://redis.io/commands/get), and [INCR](https://redis.io/commands/incr) commands. The low-level API is used for `SET` and `GET`, which sets and retrieves the value for a key named `foo`. The `INCRBY` command is executed using a high-level API i.e. [incr](https://docs.rs/redis/0.19.0/redis/trait.Commands.html#method.incr) increments the value of a key (named `counter`) by `2` followed by a call to [get](https://docs.rs/redis/0.19.0/redis/trait.Commands.html#method.get) to retrieve it.
+The function `basics` covers the [SET](https://redis.io/commands/set), [GET](https://redis.io/commands/get), and [INCR](https://redis.io/commands/incr) commands.
+
+The low-level API is used for `SET` and `GET`, which sets and retrieves the value for a key named `foo`.
+
+The `INCRBY` command is executed using a high-level API; that is, [incr](https://docs.rs/redis/0.19.0/redis/trait.Commands.html#method.incr) increments the value of a key (named `counter`) by `2`, followed by a call to [get](https://docs.rs/redis/0.19.0/redis/trait.Commands.html#method.get) to retrieve it.
```rust fn basics() {
fn basics() {
} ```
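To make the SET/GET/INCRBY semantics concrete without a live cache, here is a minimal in-memory model (an illustrative sketch, not the redis-rs API). Like Redis, it treats a missing key as `0` before an increment:

```rust
use std::collections::HashMap;

// Minimal in-memory model of SET/GET/INCRBY semantics (illustrative, not the redis-rs API).
struct MiniRedis {
    data: HashMap<String, String>,
}

impl MiniRedis {
    fn new() -> Self {
        MiniRedis { data: HashMap::new() }
    }

    fn set(&mut self, key: &str, value: &str) {
        self.data.insert(key.into(), value.into());
    }

    fn get(&self, key: &str) -> Option<&String> {
        self.data.get(key)
    }

    // INCRBY treats a missing or non-numeric key as 0, then adds the delta.
    fn incr_by(&mut self, key: &str, delta: i64) -> i64 {
        let current: i64 = self.data.get(key).and_then(|v| v.parse().ok()).unwrap_or(0);
        let next = current + delta;
        self.data.insert(key.into(), next.to_string());
        next
    }
}

fn main() {
    let mut cache = MiniRedis::new();
    cache.set("foo", "bar");
    assert_eq!(cache.get("foo").unwrap(), "bar");
    assert_eq!(cache.incr_by("counter", 2), 2); // counter did not exist: 0 + 2
    assert_eq!(cache.incr_by("counter", 2), 4);
}
```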
-The below code snippet demonstrates the functionality of a Redis `HASH` data structure. [HSET](https://redis.io/commands/hset) is invoked using the low-level API to store information (`name`, `version`, `repo`) about Redis drivers (clients). For example, details for the Rust driver (one being used in this sample code!) is captured in form of a [BTreeMap](https://doc.rust-lang.org/std/collections/struct.BTreeMap.html) and then passed on to the low-level API. It is then retrieved using [HGETALL](https://redis.io/commands/hgetall).
+The following code snippet demonstrates the functionality of a Redis `HASH` data structure. [HSET](https://redis.io/commands/hset) is invoked using the low-level API to store information (`name`, `version`, `repo`) about Redis drivers (clients). For example, details for the Rust driver (the one being used in this sample code!) are captured in the form of a [BTreeMap](https://doc.rust-lang.org/std/collections/struct.BTreeMap.html) and then passed on to the low-level API. It's then retrieved using [HGETALL](https://redis.io/commands/hgetall).
`HSET` can also be executed using a high-level API using [hset_multiple](https://docs.rs/redis/0.19.0/redis/trait.Commands.html#method.hset_multiple) that accepts an array of tuples. [hget](https://docs.rs/redis/0.19.0/redis/trait.Commands.html#method.hget) is then executed to fetch the value for a single attribute (the `repo` in this case).
fn hash() {
} ```
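The shape of that `BTreeMap` payload can be sketched on its own (the values below are illustrative placeholders, not taken from the sample's output):

```rust
use std::collections::BTreeMap;

// Hypothetical field map mirroring the sample's HSET payload (values are illustrative).
fn rust_driver_info() -> BTreeMap<&'static str, &'static str> {
    let mut driver = BTreeMap::new();
    driver.insert("name", "redis-rs");
    driver.insert("version", "0.19.0");
    driver.insert("repo", "https://github.com/mitsuhiko/redis-rs");
    driver
}

fn main() {
    let driver = rust_driver_info();
    // HGETALL returns every field; HGET fetches a single one (here, `repo`).
    assert_eq!(driver.get("repo"), Some(&"https://github.com/mitsuhiko/redis-rs"));
    for (field, value) in &driver {
        println!("{}: {}", field, value);
    }
}
```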
-In the function below, you can see how to use a `LIST` data structure. [LPUSH](https://redis.io/commands/lpush) is executed (with the low-level API) to add an entry to the list and the high-level [lpop](https://docs.rs/redis/0.19.0/redis/trait.Commands.html#method.lpop) method is used to retrieve that from the list. Then, the [rpush](https://docs.rs/redis/0.19.0/redis/trait.Commands.html#method.rpush) method is used to add a couple of entries to the list which are then fetched using the low-level [lrange](https://docs.rs/redis/0.19.0/redis/trait.Commands.html#method.lrange) method.
+In the function below, you can see how to use a `LIST` data structure. [LPUSH](https://redis.io/commands/lpush) is executed (with the low-level API) to add an entry to the list and the high-level [lpop](https://docs.rs/redis/0.19.0/redis/trait.Commands.html#method.lpop) method is used to retrieve that from the list. Then, the [rpush](https://docs.rs/redis/0.19.0/redis/trait.Commands.html#method.rpush) method is used to add a couple of entries to the list, which are then fetched using the low-level [lrange](https://docs.rs/redis/0.19.0/redis/trait.Commands.html#method.lrange) method.
```rust fn list() {
fn set() {
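The list semantics can be modeled with a `VecDeque` from the standard library, where `LPUSH`/`LPOP` act on the front of the list and `RPUSH` on the back (an illustrative sketch, not the redis-rs API):

```rust
use std::collections::VecDeque;

// In-memory model of LPUSH/LPOP/RPUSH/LRANGE semantics (illustrative, not the redis-rs API).
fn lpush(list: &mut VecDeque<String>, value: &str) {
    list.push_front(value.into());
}

fn lpop(list: &mut VecDeque<String>) -> Option<String> {
    list.pop_front()
}

fn rpush(list: &mut VecDeque<String>, value: &str) {
    list.push_back(value.into());
}

// LRANGE uses inclusive start/stop indexes.
fn lrange(list: &VecDeque<String>, start: usize, stop: usize) -> Vec<String> {
    list.iter().skip(start).take(stop - start + 1).cloned().collect()
}

fn main() {
    let mut items = VecDeque::new();
    lpush(&mut items, "item-1");
    assert_eq!(lpop(&mut items), Some("item-1".to_string()));
    rpush(&mut items, "item-2");
    rpush(&mut items, "item-3");
    assert_eq!(lrange(&items, 0, 1), vec!["item-2", "item-3"]);
}
```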
} ```
-`sorted_set` function below demonstrates the Sorted Set data structure. [ZADD](https://redis.io/commands/zadd) is invoked (with the low-level API) to add a random integer score for a player (`player-1`). Next, the [zadd](https://docs.rs/redis/0.19.0/redis/trait.Commands.html#method.zadd) method (high-level API) is used to add more players (`player-2` to `player-5`) and their respective (randomly generated) scores. The number of entries in the sorted set is figured out using [ZCARD](https://redis.io/commands/zcard) and that's used as the limit to the [ZRANGE](https://redis.io/commands/zrange) command (invoked with the low-level API) to list out the players with their scores in ascending order.
+The `sorted_set` function below demonstrates the Sorted Set data structure. [ZADD](https://redis.io/commands/zadd) is invoked with the low-level API to add a random integer score for a player (`player-1`). Next, the [zadd](https://docs.rs/redis/0.19.0/redis/trait.Commands.html#method.zadd) method (high-level API) is used to add more players (`player-2` to `player-5`) and their respective (randomly generated) scores. The number of entries in the sorted set is determined using [ZCARD](https://redis.io/commands/zcard). That's used as the limit to the [ZRANGE](https://redis.io/commands/zrange) command (invoked with the low-level API) to list the players with their scores in ascending order.
```rust fn sorted_set() {
Start by cloning the application from GitHub.
md "C:\git-samples" ```
-1. Open a git terminal window, such as git bash. Use the `cd` command to change into the new folder where you will be cloning the sample app.
+1. Open a git terminal window, such as git bash. Use the `cd` command to change into the new folder where you'll be cloning the sample app.
```bash cd "C:\git-samples"
The application accepts connectivity and credentials in the form of environment
cargo run ```
- You will see an output as such:
+ You'll see this output:
```bash ******* Running SET, GET, INCR commands *******
The application accepts connectivity and credentials in the form of environment
## Clean up resources
-If you're finished with the Azure resource group and resources you created in this quickstart, you can delete them to avoid charges.
+You can delete the resource group and resources when you're finished with them. By deleting what you created in this quickstart, you avoid being charged for them.
> [!IMPORTANT] > Deleting a resource group is irreversible, and the resource group and all the resources in it are permanently deleted. If you created your Azure Cache for Redis instance in an existing resource group that you want to keep, you can delete just the cache by selecting **Delete** from the cache **Overview** page.
If you're finished with the Azure resource group and resources you created in th
To delete the resource group and its Redis Cache for Azure instance: 1. From the [Azure portal](https://portal.azure.com), search for and select **Resource groups**.
-1. In the **Filter by name** text box, enter the name of the resource group that contains your cache instance, and then select it from the search results.
+1. In the **Filter by name** text box, enter the name of the resource group that contains your cache instance. Then, select it from the search results.
1. On your resource group page, select **Delete resource group**. 1. Type the resource group name, and then select **Delete**.
azure-functions Functions Bindings Service Bus https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-bindings-service-bus.md
The example host.json file below contains only the settings for version 5.0.0 an
"maxConcurrentCalls": 32, "maxConcurrentSessions": 10, "maxMessages": 2000,
- "sessionIdleTimeout": "00:01:00",
- "maxAutoLockRenewalDuration": "00:05:00"
+ "sessionIdleTimeout": "00:01:00"
} } }
In addition to the above configuration properties when using version 5.x and hig
## Next steps - [Run a function when a Service Bus queue or topic message is created (Trigger)](./functions-bindings-service-bus-trigger.md)-- [Send Azure Service Bus messages from Azure Functions (Output binding)](./functions-bindings-service-bus-output.md)
+- [Send Azure Service Bus messages from Azure Functions (Output binding)](./functions-bindings-service-bus-output.md)
azure-functions Functions Debug Event Grid Trigger Local https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-debug-event-grid-trigger-local.md
This article demonstrates how to debug a local function that handles an Azure Ev
## Prerequisites - Create or use an existing function app-- Create or use an existing storage account
+- Create or use an existing storage account. An Event Grid notification subscription can be set on Azure Storage accounts for `BlobStorage`, `StorageV2`, or [Data Lake Storage Gen2](/azure/storage/blobs/data-lake-storage-introduction).
- Download [ngrok](https://ngrok.com/) to allow Azure to call your local function ## Create a new function
azure-functions Functions How To Azure Devops https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-how-to-azure-devops.md
steps:
You must include one of the following YAML samples in your YAML file, depending on the hosting OS.
-#### Windows function app
+# [Windows](#tab/windows)
You can use the following snippet to deploy a Windows function app:
steps:
#slotName: '<Slot name>' ```
-#### Linux function app
+# [Linux](#tab/linux)
You can use the following snippet to deploy a Linux function app:
steps:
#resourceGroupName: '<Resource Group Name>' #slotName: '<Slot name>' ```+ ## Template-based pipeline
To create a build pipeline in Azure, use the `az functionapp devops-pipeline cre
## Next steps - Review the [Azure Functions overview](functions-overview.md).-- Review the [Azure DevOps overview](/azure/devops/pipelines/).
+- Review the [Azure DevOps overview](/azure/devops/pipelines/).
azure-functions Functions Reference Java https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-reference-java.md
The following example shows the operating system setting in the `runtime` sectio
## JDK runtime availability and support
-For local development of Java function apps, download and use the appropriate [Azul Zulu Enterprise for Azure](https://assets.azul.com/files/Zulu-for-Azure-FAQ.pdf) Java JDKs from [Azul Systems](https://www.azul.com/downloads/azure-only/zulu/). Azure Functions uses an Azul Java JDK runtime when you deploy your function app to the cloud.
+For local development of Java function apps, download and use the appropriate Azul Zulu Enterprise for Azure Java JDKs from [Azul Systems](https://www.azul.com/downloads/azure-only/zulu/). Azure Functions uses an Azul Java JDK runtime when you deploy your function app to the cloud.
[Azure support](https://azure.microsoft.com/support/) for issues with the JDKs and function apps is available with a [qualified support plan](https://azure.microsoft.com/support/plans/).
azure-government Azure Services In Fedramp Auditscope https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-government/compliance/azure-services-in-fedramp-auditscope.md
Title: Azure Services in FedRAMP and DoD SRG Audit Scope
description: This article contains tables for Azure Public and Azure Government that illustrate what FedRAMP (Moderate vs. High) and DoD SRG (Impact level 2, 4, 5 or 6) audit scope a given service has reached. Previously updated : 05/13/2021 Last updated : 05/17/2021
This article provides a detailed list of in-scope cloud services across Azure Pu
| [Azure Advisor](https://azure.microsoft.com/services/advisor/) | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | | :heavy_check_mark: | | [Azure Analysis Services](https://azure.microsoft.com/services/analysis-services/) | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | | [Azure API for FHIR](https://azure.microsoft.com/services/azure-api-for-fhir/) | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | | :heavy_check_mark: |
+| [Azure App Configuration](https://azure.microsoft.com/services/app-configuration/) | :heavy_check_mark: | | | | :heavy_check_mark: |
| [Azure Bastion](https://azure.microsoft.com/services/azure-bastion/) | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | | :heavy_check_mark: | | [Azure Blueprints](https://azure.microsoft.com/services/blueprints/) | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | | :heavy_check_mark: | | [Azure Bot Service](/azure/bot-service/) | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | | :heavy_check_mark: |
azure-monitor Alerts Troubleshoot Log https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/alerts/alerts-troubleshoot-log.md
A common issue is that you think that the alert didn't fire the actions because
![Suppress alerts](media/alerts-troubleshoot-log/LogAlertSuppress.png)
+### Alert scope resource has been moved, renamed, or deleted
+
+When you author an alert rule, Log Analytics creates a permission snapshot for your user ID. This snapshot is saved in the rule and contains the Azure Resource Manager ID of the rule's scope resource. If the scope resource is moved, renamed, or deleted, all log alert rules that refer to it break, and you must re-create them by using the new Azure Resource Manager ID.
+ ### Metric measurement alert rule with splitting using the legacy Log Analytics API [Metric measurement](alerts-unified-log.md#calculation-of-measure-based-on-a-numeric-column-such-as-cpu-counter-value) is a type of log alert that is based on summarized time series results. These rules allow grouping by columns to [split alerts](alerts-unified-log.md#split-by-alert-dimensions). If you're using the legacy Log Analytics API, splitting won't work as expected. Choosing the grouping in the legacy API isn't supported.
azure-monitor Asp Net Core https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/asp-net-core.md
The [Application Insights SDK for ASP.NET Core](https://nuget.org/packages/Micro
- A valid Application Insights instrumentation key. This key is required to send any telemetry to Application Insights. If you need to create a new Application Insights resource to get an instrumentation key, see [Create an Application Insights resource](./create-new-resource.md). > [!IMPORTANT]
-> New Azure regions **require** the use of connection strings instead of instrumentation keys. [Connection string](./sdk-connection-string.md?tabs=net) identifies the resource that you want to associate your telemetry data with. It also allows you to modify the endpoints your resource will use as a destination for your telemetry. You will need to copy the connection string and add it to your application's code or to an environment variable.
+> [Connection strings](./sdk-connection-string.md?tabs=net) are recommended over instrumentation keys. New Azure regions **require** the use of connection strings instead of instrumentation keys. A connection string identifies the resource that you want to associate your telemetry data with. It also allows you to modify the endpoints your resource will use as a destination for your telemetry. You will need to copy the connection string and add it to your application's code or to an environment variable.
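As a hedged illustration of the note above (the key and endpoint values are placeholders, not real credentials), the Application Insights SDKs read the documented `APPLICATIONINSIGHTS_CONNECTION_STRING` environment variable, so the connection string can be supplied without hard-coding it:

```shell
# Placeholder values for illustration only; copy the real connection string
# from your Application Insights resource in the Azure portal.
export APPLICATIONINSIGHTS_CONNECTION_STRING="InstrumentationKey=00000000-0000-0000-0000-000000000000;IngestionEndpoint=https://eastus-0.in.applicationinsights.azure.com/"

# Confirm the variable is visible to the application process:
echo "$APPLICATIONINSIGHTS_CONNECTION_STRING"
```

Setting the value in the environment keeps the secret out of source control and lets you point the same build at different Application Insights resources per environment.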
## Enable Application Insights server-side telemetry (Visual Studio)
azure-monitor Asp Net Troubleshoot No Data https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/asp-net-troubleshoot-no-data.md
Internet Information Services (IIS) logs counts of all request reaching IIS and
* See [Troubleshooting Status Monitor](./monitor-performance-live-website-now.md#troubleshoot). > [!IMPORTANT]
-> New Azure regions **require** the use of connection strings instead of instrumentation keys. [Connection string](./sdk-connection-string.md?tabs=net) identifies the resource that you want to associate your telemetry data with. It also allows you to modify the endpoints your resource will use as a destination for your telemetry. You will need to copy the connection string and add it to your application's code or to an environment variable.
+> [Connection strings](./sdk-connection-string.md?tabs=net) are recommended over instrumentation keys. New Azure regions **require** the use of connection strings instead of instrumentation keys. A connection string identifies the resource that you want to associate your telemetry data with. It also allows you to modify the endpoints your resource will use as a destination for your telemetry. You will need to copy the connection string and add it to your application's code or to an environment variable.
## FileNotFoundException: Could not load file or assembly 'Microsoft.AspNet TelemetryCorrelation
azure-monitor Asp Net https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/asp-net.md
If you don't have an Azure subscription, create a [free](https://azure.microsoft
- Create an [Application Insights workspace-based resource](create-workspace-resource.md). > [!IMPORTANT]
-> New Azure regions **require** the use of connection strings instead of instrumentation keys. [Connection string](./sdk-connection-string.md?tabs=net) identifies the resource that you want to associate your telemetry data with. It also allows you to modify the endpoints your resource will use as a destination for your telemetry. You will need to copy the connection string and add it to your application's code or to an environment variable.
+> [Connection strings](./sdk-connection-string.md?tabs=net) are recommended over instrumentation keys. New Azure regions **require** the use of connection strings instead of instrumentation keys. A connection string identifies the resource that you want to associate your telemetry data with. It also allows you to modify the endpoints your resource will use as a destination for your telemetry. You will need to copy the connection string and add it to your application's code or to an environment variable.
## Create a basic ASP.NET web app
azure-monitor Console https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/console.md
You need a subscription with [Microsoft Azure](https://azure.com). Sign in with
## Getting started > [!IMPORTANT]
-> New Azure regions **require** the use of connection strings instead of instrumentation keys. [Connection string](./sdk-connection-string.md?tabs=net) identifies the resource that you want to associate your telemetry data with. It also allows you to modify the endpoints your resource will use as a destination for your telemetry. You will need to copy the connection string and add it to your application's code or to an environment variable.
+> [Connection strings](./sdk-connection-string.md?tabs=net) are recommended over instrumentation keys. New Azure regions **require** the use of connection strings instead of instrumentation keys. A connection string identifies the resource that you want to associate your telemetry data with. It also allows you to modify the endpoints your resource will use as a destination for your telemetry. You will need to copy the connection string and add it to your application's code or to an environment variable.
* In the [Azure portal](https://portal.azure.com), [create an Application Insights resource](./create-new-resource.md). For application type, choose **General**. * Take a copy of the Instrumentation Key. Find the key in the **Essentials** drop-down of the new resource you created.
azure-monitor Create New Resource https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/create-new-resource.md
When your app has been created, a new pane opens. This pane is where you see per
The instrumentation key identifies the resource that you want to associate your telemetry data with. You will need to copy the instrumentation key and add it to your application's code. > [!IMPORTANT]
-> New Azure regions **require** the use of connection strings instead of instrumentation keys. [Connection string](./sdk-connection-string.md?tabs=net) identifies the resource that you want to associate your telemetry data with. It also allows you to modify the endpoints your resource will use as a destination for your telemetry. You will need to copy the connection string and add it to your application's code or to an environment variable.
+> [Connection strings](./sdk-connection-string.md) are recommended over instrumentation keys. New Azure regions **require** the use of connection strings instead of instrumentation keys. A connection string identifies the resource that you want to associate your telemetry data with. It also allows you to modify the endpoints your resource will use as a destination for your telemetry. You will need to copy the connection string and add it to your application's code or to an environment variable.
## Install the SDK in your app
azure-monitor Ilogger https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/ilogger.md
This code is required only when you use a standalone logging provider. For regul
``` > [!IMPORTANT]
-> New Azure regions **require** the use of connection strings instead of instrumentation keys. [Connection string](./sdk-connection-string.md?tabs=net) identifies the resource that you want to associate your telemetry data with. It also allows you to modify the endpoints your resource will use as a destination for your telemetry. You will need to copy the connection string and add it to your application's code or to an environment variable.
+> [Connection strings](./sdk-connection-string.md?tabs=net) are recommended over instrumentation keys. New Azure regions **require** the use of connection strings instead of instrumentation keys. A connection string identifies the resource that you want to associate your telemetry data with. It also allows you to modify the endpoints your resource will use as a destination for your telemetry. You will need to copy the connection string and add it to your application's code or to an environment variable.
## Next steps
azure-monitor Ip Collection https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/ip-collection.md
To enable IP collection and storage, the `DisableIpMasking` property of the Appl
} ```
-### Portal
+### Portal
If you only need to modify the behavior for a single Application Insights resource, use the Azure portal.
If you only need to modify the behavior for a single Application Insights resour
> [!WARNING] > If you experience an error that says: **_The resource group is in a location that is not supported by one or more resources in the template. Please choose a different resource group._** Temporarily select a different resource group from the dropdown and then re-select your original resource group to resolve the error.
-5. Select **I agree** > **Purchase**.
+5. Select **Review + create** > **Create**.
- ![Checked box with words "I agree to the terms and conditions stated above" highlighted in red above a button with the word "Purchase" highlighted in red.](media/ip-collection/purchase.png)
-
- In this case, nothing new is actually being purchased. We're only updating the configuration of the existing Application Insights resource.
+ > [!NOTE]
 + > If you see "Your deployment failed", look through your deployment details for the entry of type "microsoft.insights/components" and check its status. If that entry succeeded, your changes to `DisableIpMasking` were deployed.
6. Once the deployment is complete, new telemetry data will be recorded.
- If you select and edit the template again, you'll only see the default template without the newly added property. If you aren't seeing IP address data and want to confirm that `"DisableIpMasking": true` is set, run the following PowerShell:
+ If you select and edit the template again, you'll only see the default template without the newly added property. If you aren't seeing IP address data and want to confirm that `"DisableIpMasking": true` is set, run the following PowerShell:
```powershell # Replace `Fabrikam-dev` with the appropriate resource and resource group name.
azure-monitor Java 2X Get Started https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/java-2x-get-started.md
Application Insights is an extensible analytics service for web developers that
## Get an Application Insights instrumentation key > [!IMPORTANT]
-> New Azure regions **require** the use of connection strings instead of instrumentation keys. [Connection string](./sdk-connection-string.md?tabs=java) identifies the resource that you want to associate your telemetry data with. It also allows you to modify the endpoints your resource will use as a destination for your telemetry. You will need to copy the connection string and add it to your application's code or to an environment variable.
+> [Connection strings](./sdk-connection-string.md?tabs=java) are recommended over instrumentation keys. New Azure regions **require** the use of connection strings instead of instrumentation keys. A connection string identifies the resource that you want to associate your telemetry data with. It also allows you to modify the endpoints your resource will use as a destination for your telemetry. You will need to copy the connection string and add it to your application's code or to an environment variable.
1. Sign in to the [Azure portal](https://portal.azure.com/). 2. In the Azure portal, create an Application Insights resource. Set the application type to Java web application.
azure-monitor Javascript https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/javascript.md
Application Insights can be used with any web pages - you just add a short piece
## Adding the JavaScript SDK > [!IMPORTANT]
-> New Azure regions **require** the use of connection strings instead of instrumentation keys. [Connection string](./sdk-connection-string.md?tabs=js) identifies the resource that you want to associate your telemetry data with. It also allows you to modify the endpoints your resource will use as a destination for your telemetry. You will need to copy the connection string and add it to your application's code or to an environment variable.
+> [Connection strings](./sdk-connection-string.md?tabs=js) are recommended over instrumentation keys. New Azure regions **require** the use of connection strings instead of instrumentation keys. A connection string identifies the resource that you want to associate your telemetry data with. It also allows you to modify the endpoints your resource will use as a destination for your telemetry. You will need to copy the connection string and add it to your application's code or to an environment variable.
1. First you need an Application Insights resource. If you don't already have a resource and instrumentation key, follow the [create a new resource instructions](create-new-resource.md). 2. Copy the _instrumentation key_ (also known as "iKey") or [connection string](#connection-string-setup) for the resource where you want your JavaScript telemetry to be sent (from step 1.) You will add it to the `instrumentationKey` or `connectionString` setting of the Application Insights JavaScript SDK.
azure-monitor Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/nodejs.md
Before you begin, make sure that you have an Azure subscription, or [get a new o
Include the SDK in your app, so it can gather data. > [!IMPORTANT]
-> New Azure regions **require** the use of connection strings instead of instrumentation keys. [Connection string](./sdk-connection-string.md?tabs=nodejs) identifies the resource that you want to associate your telemetry data with. It also allows you to modify the endpoints your resource will use as a destination for your telemetry. You will need to copy the connection string and add it to your application's code or to an environment variable.
+> [Connection strings](./sdk-connection-string.md?tabs=nodejs) are recommended over instrumentation keys. New Azure regions **require** the use of connection strings instead of instrumentation keys. A connection string identifies the resource that you want to associate your telemetry data with. It also allows you to modify the endpoints your resource will use as a destination for your telemetry. You will need to copy the connection string and add it to your application's code or to an environment variable.
1. Copy your resource's instrumentation Key (also called an *ikey*) from your newly created resource. Application Insights uses the ikey to map data to your Azure resource. Before the SDK can use your ikey, you must specify the ikey in an environment variable or in your code.
azure-monitor Faq https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/faq.md
For syslog on Linux, you can choose Facilities and log level for each facility t
### Does the new Azure Monitor agent support sending data to EventHubs and Azure Storage Accounts? Not yet, but the new agent along with Data Collection Rules will support sending data to both Event Hubs as well as Azure Storage accounts in the future. Watch out for announcements in Azure Updates or join the [Teams channel](https://teams.microsoft.com/l/team/19%3af3f168b782f64561b52abe75e59e83bc%40thread.tacv2/conversations?groupId=770d6aa5-c2f7-4794-98a0-84fd6ae7f193&tenantId=72f988bf-86f1-41af-91ab-2d7cd011db47) for frequent updates, support, news and more! -
+### Does the new Azure Monitor agent have hardening support for Linux?
+This isn't currently available for the agent in preview; support is planned to be added after general availability (GA).
## Visualizations
In Solution Explorer, right-click `ApplicationInsights.config` and choose **Upda
New Azure regions **require** the use of connection strings instead of instrumentation keys. [Connection string](./app/sdk-connection-string.md) identifies the resource that you want to associate your telemetry data with. It also allows you to modify the endpoints your resource will use as a destination for your telemetry. You will need to copy the connection string and add it to your application's code or to an environment variable.
+### Should I use connection strings or instrumentation keys?
+
+[Connection strings](./app/sdk-connection-string.md) are recommended over instrumentation keys.
+ ### Can I use `providers('Microsoft.Insights', 'components').apiVersions[0]` in my Azure Resource Manager deployments? We do not recommend using this method of populating the API version. The newest version can represent preview releases which may contain breaking changes. Even with newer non-preview releases, the API versions are not always backwards compatible with existing templates, or in some cases the API version may not be available to all subscriptions.
azure-monitor Logs Data Export https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/logs/logs-data-export.md
Data is sent to your event hub in near-real-time as it reaches Azure Monitor. An
> The [number of supported event hubs per 'Basic' and 'Standard' namespaces tiers is 10](../../event-hubs/event-hubs-quotas.md#common-limits-for-all-tiers). If you export more than 10 tables, either split the tables between several export rules to different event hub namespaces, or provide event hub name in the export rule and export all tables to that event hub. Considerations:
-1. The 'Basic' event hub SKU supports a lower event size [limit](../../event-hubs/event-hubs-quotas.md#basic-vs-standard-tiers) and some logs in your workspace can exceed it and be dropped. We recommend using a 'Standard' or 'Dedicated' event hub as an export destination.
+1. The 'Basic' event hub SKU supports a lower event size [limit](../../event-hubs/event-hubs-quotas.md#basic-vs-standard-vs-dedicated-tiers) and some logs in your workspace can exceed it and be dropped. We recommend using a 'Standard' or 'Dedicated' event hub as an export destination.
2. The volume of exported data often increases over time, and the event hub scale needs to be increased to handle larger transfer rates and avoid throttling scenarios and data latency. You should use the auto-inflate feature of Event Hubs to automatically scale up and increase the number of throughput units to meet usage needs. See [Automatically scale up Azure Event Hubs throughput units](../../event-hubs/event-hubs-auto-inflate.md) for details. ## Prerequisites
azure-netapp-files Azure Netapp Files Mount Unmount Volumes For Virtual Machines https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-netapp-files/azure-netapp-files-mount-unmount-volumes-for-virtual-machines.md
Previously updated : 11/17/2020 Last updated : 05/17/2021 # Mount or unmount a volume for Windows or Linux virtual machines
You can mount or unmount a volume for Windows or Linux virtual machines as neces
* If you are mounting an NFS volume, ensure that you use the `vers` option in the `mount` command to specify the NFS protocol version that corresponds to the volume you want to mount. * If you are using NFSv4.1, use the following command to mount your file system: `sudo mount -t nfs -o rw,hard,rsize=65536,wsize=65536,vers=4.1,tcp,sec=sys $MOUNTTARGETIPADDRESS:/$VOLUMENAME $MOUNTPOINT` > [!NOTE]
- > If you use NFSv4.1, ensure that all VMs mounting the export use unique hostnames.
+ > If you use NFSv4.1 and your use case involves leveraging VMs with the same hostnames (for example, in a DR test), see [Configure two VMs with the same hostname to access NFSv4.1 volumes](configure-nfs-clients.md#configure-two-vms-with-the-same-hostname-to-access-nfsv41-volumes).
3. If you want to have an NFS volume automatically mounted when an Azure VM is started or rebooted, add an entry to the `/etc/fstab` file on the host.
You can mount or unmount a volume for Windows or Linux virtual machines as neces
* [NFS FAQs](./azure-netapp-files-faqs.md#nfs-faqs) * [Network File System overview](/windows-server/storage/nfs/nfs-overview) * [Mount an NFS Kerberos volume](configure-kerberos-encryption.md#kerberos_mount)
+* [Configure two VMs with the same hostname to access NFSv4.1 volumes](configure-nfs-clients.md#configure-two-vms-with-the-same-hostname-to-access-nfsv41-volumes)
azure-netapp-files Configure Nfs Clients https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-netapp-files/configure-nfs-clients.md
na ms.devlang: na Previously updated : 05/10/2021 Last updated : 05/17/2021 # Configure an NFS client for Azure NetApp Files
The following example queries the AD LDAP server from Ubuntu LDAP client for an
`root@cbs-k8s-varun4-04:/home/cbs# getent passwd hari1` `hari1:*:1237:1237:hari1:/home/hari1:/bin/bash`
+## Configure two VMs with the same hostname to access NFSv4.1 volumes
+
+This section explains how you can configure two VMs that have the same hostname to access Azure NetApp Files NFSv4.1 volumes. This procedure can be useful when you conduct a disaster recovery (DR) test and require a test system with the same hostname as the primary DR system. This procedure is only required when you have the same hostname on two VMs that are accessing the same Azure NetApp Files volumes.
+
+NFSv4.x requires each client to identify itself to servers with a *unique* string. File open and lock state shared between one client and one server is associated with this identity. To support robust NFSv4.x state recovery and transparent state migration, this identity string must not change across client reboots.
+
+1. Display the `nfs4_unique_id` string on the VM clients by using the following command:
+
+ `# systool -v -m nfs | grep -i nfs4_unique`
+ ` nfs4_unique_id = ""`
+
+ To mount the same volume on an additional VM with the same hostname (for example, the DR system), create an `nfs4_unique_id` so that the VM can uniquely identify itself to the Azure NetApp Files NFS service. This step allows the service to distinguish between the two VMs with the same hostname and enables mounting NFSv4.1 volumes on both VMs.
+
+ You need to perform this step only on the test DR system. For consistency, consider applying a unique setting on each VM involved.
+
+2. On the test DR system, add the following line to the `nfsclient.conf` file, typically located in `/etc/modprobe.d/`:
+
+ `options nfs nfs4_unique_id=uniquenfs4-1`
+
+ The string `uniquenfs4-1` can be any alphanumeric string, as long as it is unique across the VMs to be connected to the service.
+
+ Check your distribution's documentation about how to configure NFS client settings.
+
+ Reboot the VM for the change to take effect.
+
+3. On the test DR system, verify that `nfs4_unique_id` has been set after the VM reboot:
+
+ `# systool -v -m nfs | grep -i nfs4_unique`
+ ` nfs4_unique_id = "uniquenfs4-1"`
+
+4. [Mount the NFSv4.1 volume](azure-netapp-files-mount-unmount-volumes-for-virtual-machines.md) on both VMs as normal.
+
+ Both VMs with the same hostname can now mount and access the NFSv4.1 volume.
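The configuration steps above can be sketched as follows (a minimal sketch: the ID string and the `/etc/modprobe.d/nfsclient.conf` path are illustrative, and the commands that actually modify the system are shown commented out so nothing destructive runs by accident):

```shell
# Illustrative values; the ID can be any alphanumeric string that is
# unique across the VMs connecting to the service.
ID="uniquenfs4-1"
CONF="/etc/modprobe.d/nfsclient.conf"   # typical location; check your distribution
LINE="options nfs nfs4_unique_id=${ID}"
echo "${LINE}"

# On the test DR system you would append the line and reboot:
#   echo "${LINE}" | sudo tee -a "${CONF}"
#   sudo reboot
# After the reboot, verify the identifier is set:
#   systool -v -m nfs | grep -i nfs4_unique
```

Because `nfs4_unique_id` is a kernel module option, the change takes effect only after the `nfs` module is reloaded, which is why the procedure calls for a reboot.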
## Next steps * [Create an NFS volume for Azure NetApp Files](azure-netapp-files-create-volumes.md) * [Create a dual-protocol volume for Azure NetApp Files](create-volumes-dual-protocol.md)-
+* [Mount or unmount a volume for Windows or Linux virtual machines](azure-netapp-files-mount-unmount-volumes-for-virtual-machines.md)
azure-netapp-files Create Active Directory Connections https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-netapp-files/create-active-directory-connections.md
na ms.devlang: na Previously updated : 04/06/2021 Last updated : 05/17/2021 # Create and manage Active Directory connections for Azure NetApp Files
This setting is configured in the **Active Directory Connections** under **NetAp
![Join Active Directory](../media/azure-netapp-files/azure-netapp-files-join-active-directory.png) * **AES Encryption**
- Select this checkbox if you want to enable AES encryption for an SMB volume. See [Requirements for Active Directory connections](#requirements-for-active-directory-connections) for requirements.
-
+ Select this checkbox if you want to enable AES encryption for AD authentication or if you require [encryption for SMB volumes](azure-netapp-files-create-volumes-smb.md#add-an-smb-volume).
+
+ See [Requirements for Active Directory connections](#requirements-for-active-directory-connections) for requirements.
+
![Active Directory AES encryption](../media/azure-netapp-files/active-directory-aes-encryption.png) The **AES Encryption** feature is currently in preview. If this is your first time using this feature, register the feature before using it:
azure-netapp-files Cross Region Replication Introduction https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-netapp-files/cross-region-replication-introduction.md
Azure NetApp Files volume replication is supported between various [Azure region
* North Europe and West Europe * UK South and UK West
-### Azure regional non-pairs
+### Azure regional non-standard pairs
* West US 2 and East US * South Central US and Central US
azure-percept How To Select Update Package https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-percept/how-to-select-update-package.md
To ensure you apply the correct update package to your dev kit, you must first d
Using the **model** and **swVersion** identified in the previous section, check the table below to determine which update package to download.
-|model |swVersion |Update method |Download links |
-|||||
-|PE-101 |2020.108.101.105, <br>2020.108.114.120, <br>2020.109.101.122, <br>2020.109.116.120, <br>2021.101.106.118 |**USB only** |[USB update package](https://go.microsoft.com/fwlink/?linkid=2155734) |
-|PE-101 |2021.102.108.112, <br> |OTA or USB |[OTA manifest](https://go.microsoft.com/fwlink/?linkid=2155625)<br>[OTA update package](https://go.microsoft.com/fwlink/?linkid=2161538)<br>[USB update package](https://go.microsoft.com/fwlink/?linkid=2155734) |
-|APDK-101 |All swVersions |OTA or USB | [OTA manifest](https://go.microsoft.com/fwlink/?linkid=2162292)<br>[OTA update package](https://go.microsoft.com/fwlink/?linkid=2161538)<br>[USB update package](https://go.microsoft.com/fwlink/?linkid=2155734) |
-
+|model |swVersion |Update method |Download links |Note |
+||||||
+|PE-101 |2020.108.101.105, <br>2020.108.114.120, <br>2020.109.101.122, <br>2020.109.116.120, <br>2021.101.106.118 |**USB only** |[2021.104.110.103 USB update package](https://go.microsoft.com/fwlink/?linkid=2155734) |Public Preview major release |
+|PE-101 |2021.102.108.112, <br> |OTA or USB |[2021.104.110.103 OTA manifest](https://go.microsoft.com/fwlink/?linkid=2155625)<br>[2021.104.110.103 OTA update package](https://go.microsoft.com/fwlink/?linkid=2161538)<br>[2021.104.110.103 USB update package](https://go.microsoft.com/fwlink/?linkid=2155734) |Public Preview major release |
+|APDK-101 |All swVersions |OTA or USB | [2021.105.111.112 OTA manifest](https://go.microsoft.com/fwlink/?linkid=2163554)<br>[2021.105.111.112 OTA update package](https://go.microsoft.com/fwlink/?linkid=2163456)<br>[2021.105.111.112 USB update package](https://go.microsoft.com/fwlink/?linkid=2163555) |Latest monthly release (May) |
## Next steps Update your dev kits via the methods and update packages determined in the previous section. - [Update your Azure Percept DK over-the-air](https://docs.microsoft.com/azure/azure-percept/how-to-update-over-the-air) - [Update your Azure Percept DK via USB](https://docs.microsoft.com/azure/azure-percept/how-to-update-via-usb)-
azure-resource-manager Reference Createuidefinition Artifact https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/managed-applications/reference-createuidefinition-artifact.md
The following JSON shows an example of *createUiDefinition.json* file for Azure
{ "name": "zipFileBlobUri", "type": "Microsoft.Common.TextBox",
- "defaultValue": "https://github.com/Azure/azure-quickstart-templates/tree/master/101-custom-rp-with-function/artifacts/functionzip/functionpackage.zip",
+ "defaultValue": "https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.customproviders/custom-rp-with-function/artifacts/functionzip/functionpackage.zip",
"label": "The Uri to the uploaded function zip file", "toolTip": "The Uri to the uploaded function zip file", "visible": true
azure-resource-manager Reference Main Template Artifact https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/managed-applications/reference-main-template-artifact.md
The following JSON shows an example of *mainTemplate.json* file for Azure Manage
}, "zipFileBlobUri": { "type": "string",
- "defaultValue": "https://github.com/Azure/azure-quickstart-templates/tree/master/101-custom-rp-with-function/artifacts/functionzip/functionpackage.zip",
+ "defaultValue": "https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.customproviders/custom-rp-with-function/artifacts/functionzip/functionpackage.zip",
"metadata": { "description": "The Uri to the uploaded function zip file" }
azure-resource-manager Tutorial Create Managed App With Custom Provider https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/managed-applications/tutorial-create-managed-app-with-custom-provider.md
The user interface definition for creating a managed application instance includ
{ "name": "zipFileBlobUri", "type": "Microsoft.Common.TextBox",
- "defaultValue": "https://github.com/Azure/azure-quickstart-templates/tree/master/101-custom-rp-with-function/artifacts/functionzip/functionpackage.zip",
+ "defaultValue": "https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.customproviders/custom-rp-with-function/artifacts/functionzip/functionpackage.zip",
"label": "The Uri to the uploaded function zip file", "toolTip": "The Uri to the uploaded function zip file", "visible": true
azure-resource-manager Azure Subscription Service Limits https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/management/azure-subscription-service-limits.md
The latest values for Azure Purview quotas can be found in the [Azure Purview qu
For SQL Database limits, see [SQL Database resource limits for single databases](../../azure-sql/database/resource-limits-vcore-single-databases.md), [SQL Database resource limits for elastic pools and pooled databases](../../azure-sql/database/resource-limits-vcore-elastic-pools.md), and [SQL Database resource limits for SQL Managed Instance](../../azure-sql/managed-instance/resource-limits.md).
+The maximum number of private endpoints per Azure SQL Database logical server is 250.
+ ## Azure Synapse Analytics limits [!INCLUDE [synapse-analytics-limits](../../../includes/synapse-analytics-limits.md)]
azure-resource-manager Bicep Decompile https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/templates/bicep-decompile.md
Title: Convert templates between JSON and Bicep description: Describes commands for converting Azure Resource Manager templates from Bicep to JSON and from JSON to Bicep. Previously updated : 03/12/2021 Last updated : 05/17/2021 # Converting ARM templates between JSON and Bicep
The conversion commands produce templates that are functionally equivalent. Howe
## Convert from JSON to Bicep
-The Bicep CLI provides a command to decompile any existing JSON template to a Bicep file. To decompile a JSON file, use:
+The Bicep CLI provides a command to decompile any existing JSON template to a Bicep file.
+
+If you have Azure CLI version 2.20.0 or later, the Bicep CLI is automatically installed. To decompile a JSON file, use:
```azurecli
+az bicep decompile mainTemplate.json
+```
+
+If you don't have a recent version of Azure CLI, and have instead installed the Bicep CLI manually, use:
+
+```bash
bicep decompile mainTemplate.json ```
This command provides a starting point for Bicep authoring. The command doesn't
## Convert from Bicep to JSON
-The Bicep CLI also provides a command to convert Bicep to JSON. To build a JSON file, use:
+The Bicep CLI also provides a command to convert Bicep to JSON.
+
+For Azure CLI version 2.20.0 or later, use the following command to build a JSON file:
```azurecli
+az bicep build mainTemplate.bicep
+```
+
+If you've installed the Bicep CLI manually, use:
+
+```bash
bicep build mainTemplate.bicep ```
You can export the template for a resource group, and then pass it directly to t
```azurecli az group export --name "your_resource_group_name" > main.json
-bicep decompile main.json
+az bicep decompile main.json
``` # [PowerShell](#tab/azure-powershell)
azure-resource-manager Bicep Tutorial Quickstart Template https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/templates/bicep-tutorial-quickstart-template.md
Currently, the Azure Quickstart templates only provide JSON templates. There are
1. Open [Azure Quickstart templates](https://azure.microsoft.com/resources/templates/) 1. In **Search**, enter _deploy linux web app_.
-1. Select the tile with the title **Deploy a basic Linux web app**. If you have trouble finding it, here's the [direct link](https://azure.microsoft.com/resources/templates/101-webapp-basic-linux/).
+1. Select the tile with the title **Deploy a basic Linux web app**. If you have trouble finding it, here's the [direct link](https://azure.microsoft.com/resources/templates/webapp-basic-linux/).
1. Select **Browse on GitHub**. 1. Select _azuredeploy.json_. This is the template you can use. 1. Select **Raw**, and then make a copy of the URL.
azure-resource-manager Template Outputs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/templates/template-outputs.md
Title: Outputs in templates description: Describes how to define output values in an Azure Resource Manager template (ARM template) and Bicep file. Previously updated : 02/19/2021 Last updated : 05/18/2021 # Outputs in ARM templates
In JSON, add the `copy` element to iterate an output.
# [Bicep](#tab/bicep)
-Iterative output isn't currently available for Bicep.
+In Bicep, add a `for` expression that defines the conditions for the dynamic output. The following example iterates over a range of integers. You can also iterate over an array.
+
+```bicep
+output storageEndpoints array = [for i in range(0, storageCount): reference('${i}${baseName_var}').primaryEndpoints.blob]
+```
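The text above notes that you can also iterate over an array. A minimal sketch of that variant, assuming a hypothetical `storageNames` parameter (the parameter name and account names here are illustrative, not from the article):

```bicep
param storageNames array = [
  'contosodata'
  'contosologs'
]

// Emit the resource ID of each storage account, one output entry per array element.
output storageIds array = [for name in storageNames: resourceId('Microsoft.Storage/storageAccounts', name)]
```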
azure-resource-manager Test Cases https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/templates/test-cases.md
Title: Test cases for test toolkit description: Describes the tests that are run by the ARM template test toolkit. Previously updated : 04/12/2021 Last updated : 05/17/2021
The schema property in the template must be set to one of the following schemas:
* `https://schema.management.azure.com/schemas/2019-08-01/tenantDeploymentTemplate.json#` * `https://schema.management.azure.com/schemas/2019-08-01/managementGroupDeploymentTemplate.json`
-## Parameters must exist
-
-Test name: **Parameters Property Must Exist**
-
-Your template should have a parameters element. Parameters are essential for making your templates reusable in different environments. Add parameters to your template for values that change when deploying to different environments.
-
-The following example **passes** this test:
-
-```json
-{
- "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
- "contentVersion": "1.0.0.0",
- "parameters": {
- "vmName": {
- "type": "string",
- "defaultValue": "linux-vm",
- "metadata": {
- "description": "Name for the Virtual Machine."
- }
- }
- },
- ...
-```
- ## Declared parameters must be used Test name: **Parameters Must Be Referenced**
azure-signalr Authenticate Managed Identity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-signalr/authenticate-managed-identity.md
After you've determined the appropriate scope for a role assignment, navigate to
You can follow similar steps to assign a role scoped to resource group, or subscription. Once you define the role and its scope, you can test this behavior with samples [in this GitHub location](https://github.com/Azure/azure-event-hubs/tree/master/samples/DotNet/Microsoft.Azure.EventHubs/Rbac).
-## Samples code while configuring your app server
+## Configure your app
+### App server
Add the following options when calling `AddAzureSignalR`:
services.AddSignalR().AddAzureSignalR(option =>
}); ```
+### Azure Functions App
+
+In the Azure portal, add an application setting named `AzureSignalRConnectionString` with the value `Endpoint=https://<name>.signalr.net;AuthType=aad;`.
+
+Locally, in your `local.settings.json` file, add the setting to the `Values` section:
+```json
+{
+ "Values": {
+ "AzureSignalRConnectionString": "Endpoint=https://<name>.signalr.net;AuthType=aad;"
+ }
+}
+```
+ ## Next steps - To learn more about RBAC, see [What is Azure role-based access control (Azure RBAC)](../role-based-access-control/overview.md)? - To learn how to assign and manage Azure role assignments with Azure PowerShell, Azure CLI, or the REST API, see these articles:
services.AddSignalR().AddAzureSignalR(option =>
See the following related articles: - [Authenticate an application with Azure Active Directory to access Azure SignalR Service](authenticate-application.md)-- [Authorize access to Azure SignalR Service using Azure Active Directory](authorize-access-azure-active-directory.md)
+- [Authorize access to Azure SignalR Service using Azure Active Directory](authorize-access-azure-active-directory.md)
azure-sql Authentication Aad Directory Readers Role Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/authentication-aad-directory-readers-role-tutorial.md
Assigning the **Directory Readers** role to the server identity isn't required f
## Directory Readers role assignment using PowerShell > [!IMPORTANT]
-> A [Global Administrator](../../active-directory/roles/permissions-reference.md#global-administrator) or [Privileged Role Administrator](../../active-directory/roles/permissions-reference.md#privileged-role-administrator) will need to run these initial steps. In addition to PowerShell, Azure AD offers Microsoft Graph API to [Create a role-assignable group in Azure AD](../../active-directory/roles/groups-create-eligible.md#using-microsoft-graph-api).
+> A [Global Administrator](../../active-directory/roles/permissions-reference.md#global-administrator) or [Privileged Role Administrator](../../active-directory/roles/permissions-reference.md#privileged-role-administrator) will need to run these initial steps. In addition to PowerShell, Azure AD offers Microsoft Graph API to [Create a role-assignable group in Azure AD](../../active-directory/roles/groups-create-eligible.md#microsoft-graph-api).
1. Download the Azure AD Preview PowerShell module using the following commands. You may need to run PowerShell as an administrator.
azure-sql Connectivity Architecture https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/connectivity-architecture.md
Periodically, we will retire Gateways using old hardware and migrate the traffic
| Australia Central | 20.36.105.0, 20.36.104.6, 20.36.104.7 | 20.36.105.32/29 | | Australia Central 2 | 20.36.113.0, 20.36.112.6 | 20.36.113.32/29 | | Australia East | 13.75.149.87, 40.79.161.1, 13.70.112.9 | 13.70.112.32/29, 40.79.160.32/29, 40.79.168.32/29 |
-| Australia South East | 191.239.192.109, 13.73.109.251, 13.77.48.10, 13.77.49.32 | 13.77.49.32/29 |
+| Australia Southeast | 191.239.192.109, 13.73.109.251, 13.77.48.10, 13.77.49.32 | 13.77.49.32/29 |
| Brazil South | 191.233.200.14, 191.234.144.16, 191.234.152.3 | 191.233.200.32/29, 191.234.144.32/29 | | Canada Central | 40.85.224.249, 52.246.152.0, 20.38.144.1 | 13.71.168.32/29, 20.38.144.32/29, 52.246.152.32/29 | | Canada East | 40.86.226.166, 52.242.30.154, 40.69.105.9 , 40.69.105.10 | 40.69.105.32/29|
Periodically, we will retire Gateways using old hardware and migrate the traffic
| France Central | 40.79.137.0, 40.79.129.1, 40.79.137.8, 40.79.145.12 | 40.79.136.32/29, 40.79.144.32/29 | | France South | 40.79.177.0, 40.79.177.10 ,40.79.177.12 | 40.79.176.40/29, 40.79.177.32/29 | | Germany West Central | 51.116.240.0, 51.116.248.0, 51.116.152.0 | 51.116.152.32/29, 51.116.240.32/29, 51.116.248.32/29 |
-| India Central | 104.211.96.159, 104.211.86.30 , 104.211.86.31 | 104.211.86.32/29, 20.192.96.32/29 |
-| India South | 104.211.224.146 | 40.78.192.32/29, 40.78.193.32/29 |
-| India West | 104.211.160.80, 104.211.144.4 | 104.211.144.32/29, 104.211.145.32/29 |
+| Central India | 104.211.96.159, 104.211.86.30 , 104.211.86.31 | 104.211.86.32/29, 20.192.96.32/29 |
+| South India | 104.211.224.146 | 40.78.192.32/29, 40.78.193.32/29 |
+| West India | 104.211.160.80, 104.211.144.4 | 104.211.144.32/29, 104.211.145.32/29 |
| Japan East | 13.78.61.196, 40.79.184.8, 13.78.106.224, 40.79.192.5, 13.78.104.32 | 13.78.104.32/29, 40.79.184.32/29, 40.79.192.32/29 | | Japan West | 104.214.148.156, 40.74.100.192, 40.74.97.10 | 40.74.96.32/29 | | Korea Central | 52.231.32.42, 52.231.17.22 ,52.231.17.23, 20.44.24.32, 20.194.64.33 | 20.194.64.32/29,20.44.24.32/29, 52.231.16.32/29 |
Periodically, we will retire Gateways using old hardware and migrate the traffic
| West Europe | 40.68.37.158, 104.40.168.105, 52.236.184.163 | 104.40.169.32/29, 13.69.112.168/29, 52.236.184.32/29 | | West US | 104.42.238.205, 13.86.216.196 | 13.86.217.224/29 | | West US 2 | 13.66.226.202, 40.78.240.8, 40.78.248.10 | 13.66.136.192/29, 40.78.240.192/29, 40.78.248.192/29 |
-| West US 2 | 13.66.226.202, 40.78.240.8, 40.78.248.10 | 20.150.168.32/29, 20.150.176.32/29, 20.150.184.32/29 |
| | | | ## Next steps
azure-sql Features Comparison https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/features-comparison.md
Previously updated : 03/08/2021 Last updated : 05/18/2021 # Features comparison: Azure SQL Database and Azure SQL Managed Instance
The following table lists the major features of SQL Server and provides informat
| [Distributed transactions - MS DTC](/sql/relational-databases/native-client-ole-db-transactions/supporting-distributed-transactions) | No - see [Elastic transactions](elastic-transactions-overview.md) | No - see [Linked server differences](../managed-instance/transact-sql-tsql-differences-sql-server.md#linked-servers). Try to consolidate databases from several distributed SQL Server instances into one SQL Managed Instance during migration. | | [DML triggers](/sql/relational-databases/triggers/create-dml-triggers) | Most - see individual statements | Yes | | [DMVs](/sql/relational-databases/system-dynamic-management-views/system-dynamic-management-views) | Most - see individual DMVs | Yes - see [T-SQL differences](../managed-instance/transact-sql-tsql-differences-sql-server.md) |
-| [Elastic query](elastic-query-overview.md) (in public preview) | Yes, with required RDBMS type. | Yes, with required RDBMS type. |
+| [Elastic query](elastic-query-overview.md) (in public preview) | Yes, with required RDBMS type. | No |
| [Event notifications](/sql/relational-databases/service-broker/event-notifications) | No - see [Alerts](alerts-insights-configure-portal.md) | No | | [Expressions](/sql/t-sql/language-elements/expressions-transact-sql) |Yes | Yes | | [Extended events (XEvent)](/sql/relational-databases/extended-events/extended-events) | Some - see [Extended events in SQL Database](xevent-db-diff-from-svr.md) | Yes - see [Extended events differences](../managed-instance/transact-sql-tsql-differences-sql-server.md#extended-events) |
azure-sql Sql Database Vulnerability Assessment Storage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/sql-database-vulnerability-assessment-storage.md
To find out which storage account is being used, go to your **SQL server** pane
:::image type="content" source="../database/media/azure-defender-for-sql/va-storage.png" alt-text="set up vulnerability assessment":::
+> [!NOTE]
+> You can set up email alerts to notify users in your organization to view or access the scan reports. To do this, ensure that you have SQL Security Manager and Storage Blob Data Reader permissions.
+ ## Store VA scan results for Azure SQL Managed Instance in a storage account that can be accessed behind a firewall or VNet Since Managed Instance is not a trusted Microsoft Service and has a different VNet from the storage account, executing a VA scan will result in an error.
azure-sql Auditing Configure https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/managed-instance/auditing-configure.md
The following section describes the configuration of auditing on your managed in
1. Click **OK** in the **Create Audit** dialog.
- 1. <a id="createspec"></a>After you configure the blob container as target for the audit logs, create and enable a server audit specification or database audit specification as you would for SQL Server:
+ > [!NOTE]
+ > When you use the SQL Server Management Studio UI to create an audit, a credential for the container with the SAS key is automatically created.
+
+ 1. <a id="createspec"></a>After you configure the blob container as target for the audit logs, create and enable a server audit specification or database audit specification as you would for SQL Server:
- [Create server audit specification T-SQL guide](/sql/t-sql/statements/create-server-audit-specification-transact-sql) - [Create database audit specification T-SQL guide](/sql/t-sql/statements/create-database-audit-specification-transact-sql)
azure-sql Transact Sql Tsql Differences Sql Server https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/managed-instance/transact-sql-tsql-differences-sql-server.md
Operations:
### PolyBase
-The only available type of external source is RDBMS (in public preview) to Azure SQL database, Azure SQL managed instance, and Azure Synapse pool. You can use [an external table that references a serverless SQL pool in Synapse Analytics](https://devblogs.microsoft.com/azure-sql/read-azure-storage-files-using-synapse-sql-external-tables/) as a workaround for Polybase external tables that directly reads from the Azure storage.
-In Azure SQL managed instance you can use linked servers to [a serverless SQL pool in Synapse Analytics](https://devblogs.microsoft.com/azure-sql/linked-server-to-synapse-sql-to-implement-polybase-like-scenarios-in-managed-instance/) or SQL Server to read Azure storage data.
-For information about PolyBase, see [PolyBase](/sql/relational-databases/polybase/polybase-guide).
+Work on enabling PolyBase support in SQL Managed Instance is [in progress](https://feedback.azure.com/forums/915676-sql-managed-instance/suggestions/35698078-enable-polybase-on-sql-managed-instance). In the meantime, as a workaround, you can use linked servers to [a serverless SQL pool in Synapse Analytics](https://devblogs.microsoft.com/azure-sql/linked-server-to-synapse-sql-to-implement-polybase-like-scenarios-in-managed-instance/) or SQL Server to query data from files stored in Azure Data Lake or Azure Storage.
+For general information about PolyBase, see [PolyBase](/sql/relational-databases/polybase/polybase-guide).
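As a rough illustration of the linked-server workaround mentioned above, the following T-SQL sketch creates a linked server from a managed instance to a Synapse serverless SQL pool endpoint. The server, workspace, database, and view names are placeholders, and the provider appropriate for your environment may differ:

```sql
-- Create a linked server that points to a Synapse serverless SQL pool
-- endpoint (all names below are hypothetical placeholders).
EXEC sp_addlinkedserver
    @server = N'SynapseServerless',
    @srvproduct = N'',
    @provider = N'MSOLEDBSQL',
    @datasrc = N'myworkspace-ondemand.sql.azuresynapse.net';

-- Query a view defined in the serverless pool that reads Azure Storage files.
SELECT TOP (10) *
FROM [SynapseServerless].[mydb].[dbo].[myAzureStorageView];
```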
### Replication
backup Backup Azure Vms Encryption https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/backup/backup-azure-vms-encryption.md
Title: Back up and restore encrypted Azure VMs description: Describes how to back up and restore encrypted Azure VMs with the Azure Backup service. Previously updated : 08/18/2020 Last updated : 05/17/2021 # Back up and restore encrypted Azure virtual machines
Restore encrypted VMs as follows:
1. [Restore the VM disk](backup-azure-arm-restore-vms.md#restore-disks). > [!NOTE]
- > After you restore the VM disk, swap the OS disk of the original VM with the restored VM disk without re-creating it. [Learn more](https://azure.microsoft.com/blog/os-disk-swap-managed-disks/).
+ > After you restore the VM disk, you can manually swap the OS disk of the original VM with the restored VM disk without re-creating it. [Learn more](https://azure.microsoft.com/blog/os-disk-swap-managed-disks/).
2. Recreate the virtual machine instance by doing one of the following actions: 1. Use the template that's generated during the restore operation to customize VM settings, and trigger VM deployment. [Learn more](backup-azure-arm-restore-vms.md#use-templates-to-customize-a-restored-vm).
- 2. Create a new VM from the restored disks using PowerShell. [Learn more](backup-azure-vms-automation.md#create-a-vm-from-restored-disks).
-3. For Linux VMs, reinstall the ADE extension so the data disks are open and mounted.
+ >[!NOTE]
+ >While deploying the template, verify the storage account containers and the public/private settings.
+ 1. Create a new VM from the restored disks using PowerShell. [Learn more](backup-azure-vms-automation.md#create-a-vm-from-restored-disks).
+1. For Linux VMs, reinstall the ADE extension so the data disks are open and mounted.
## Next steps
backup Backup Support Matrix Iaas https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/backup/backup-support-matrix-iaas.md
Title: Support matrix for Azure VM backup description: Provides a summary of support settings and limitations when backing up Azure VMs with the Azure Backup service. Previously updated : 04/21/2021 Last updated : 05/17/2021
Restore of Zone-pinned VMs | Supported (for a VM that's backed-up after Jan 2019
Gen2 VMs | Supported <br> Azure Backup supports backup and restore of [Gen2 VMs](https://azure.microsoft.com/updates/generation-2-virtual-machines-in-azure-public-preview/). When these VMs are restored from Recovery point, they're restored as [Gen2 VMs](https://azure.microsoft.com/updates/generation-2-virtual-machines-in-azure-public-preview/). Backup of Azure VMs with locks | Unsupported for unmanaged VMs. <br><br> Supported for managed VMs. [Spot VMs](../virtual-machines/spot-vms.md) | Unsupported. Azure Backup restores Spot VMs as regular Azure VMs.
-[Azure Dedicated Host](../virtual-machines/dedicated-hosts.md) | Supported<br></br>While restoring an Azure VM through the [Create New](backup-azure-arm-restore-vms.md#create-a-vm) option, though the restore gets successful, Azure VM can't be restored in the dedicated host. To achieve this, we recommend you to restore as disks. While [restoring as disks](backup-azure-arm-restore-vms.md#restore-disks) with the template, create a VM in dedicated host, and then attach the disks.<br></br>This is also applicable in secondary region, while performing [Cross Region Restore](backup-azure-arm-restore-vms.md#cross-region-restore).
+[Azure Dedicated Host](../virtual-machines/dedicated-hosts.md) | Supported<br></br>While restoring an Azure VM through the [Create New](backup-azure-arm-restore-vms.md#create-a-vm) option, the restore succeeds, but the Azure VM can't be restored in the dedicated host. To achieve this, we recommend that you restore as disks. While [restoring as disks](backup-azure-arm-restore-vms.md#restore-disks) with the template, create a VM in the dedicated host, and then attach the disks.<br></br>This is not applicable to the secondary region when performing [Cross Region Restore](backup-azure-arm-restore-vms.md#cross-region-restore).
Windows Storage Spaces configuration of standalone Azure VMs | Supported [Azure VM Scale Sets](../virtual-machine-scale-sets/virtual-machine-scale-sets-orchestration-modes.md#scale-sets-with-flexible-orchestration) | Supported for both uniform and flexible orchestration models to back up and restore Single Azure VM.
cognitive-services Overview Multivariate https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Anomaly-Detector/overview-multivariate.md
See the following technical documents for information about the algorithms used:
* Paper: [Multivariate time series Anomaly Detection via Graph Attention Network](https://arxiv.org/abs/2009.02040)
-> [!VIDEO https://www.youtube.com/watch?v=FwuI02edclQ]
+> [!VIDEO https://www.youtube.com/embed/FwuI02edclQ]
## Join the Anomaly Detector community
See the following technical documents for information about the algorithms used:
## Next steps - [Quickstarts](./quickstarts/client-libraries-multivariate.md).-- [Best Practices](./concepts/best-practices-multivariate.md): This article is about recommended patterns to use with the multivariate APIs.
+- [Best Practices](./concepts/best-practices-multivariate.md): This article is about recommended patterns to use with the multivariate APIs.
cognitive-services Spatial Analysis Operations https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Computer-vision/spatial-analysis-operations.md
These are the parameters required by each of these Spatial Analysis operations.
| VIDEO_DECODE_GPU_INDEX| Which GPU to decode the video frame. By default it is 0. Should be the same as the `gpu_index` in other node config like `VICA_NODE_CONFIG`, `DETECTOR_NODE_CONFIG`.| | INPUT_VIDEO_WIDTH | Input video/stream's frame width (e.g. 1920). This is an optional field and if provided, the frame will be scaled to this dimension while preserving the aspect ratio.| | DETECTOR_NODE_CONFIG | JSON indicating which GPU to run the detector node on. It should be in the following format: `"{ \"gpu_index\": 0 }",`|
+| CAMERA_CONFIG | JSON indicating the calibrated camera parameters for multiple cameras. If the skill you use requires calibration and you already have the camera parameters, you can use this config to provide them directly. Should be in the following format: `"{ \"cameras\": [{\"source_id\": \"endcomputer.0.persondistancegraph.detector+end_computer1\", \"camera_height\": 13.105561256408691, \"camera_focal_length\": 297.60003662109375, \"camera_tiltup_angle\": 0.9738943576812744}] }"`. The `source_id` is used to identify each camera and can be obtained from the `source_info` field of the published events. It takes effect only when `do_calibration=false` in `DETECTOR_NODE_CONFIG`.|
+| TRACKER_NODE_CONFIG | JSON indicating whether to compute speed in the tracker node or not. It should be in the following format: `"{ \"enable_speed\": false }",`|
| SPACEANALYTICS_CONFIG | JSON configuration for zone and line as outlined below.| | ENABLE_FACE_MASK_CLASSIFIER | `True` to enable detecting people wearing face masks in the video stream, `False` to disable it. By default this is disabled. Face mask detection requires input video width parameter to be 1920 `"INPUT_VIDEO_WIDTH": 1920`. The face mask attribute will not be returned if detected people are not facing the camera or are too far from it. Refer to the [camera placement](spatial-analysis-camera-placement.md) guide for more information |
This is an example of the DETECTOR_NODE_CONFIG parameters for all Spatial Analys
| `enable_breakpad`| bool | Indicates whether you want to enable breakpad, which is used to generate crash dump for debug use. It is `false` by default. If you set it to `true`, you also need to add `"CapAdd": ["SYS_PTRACE"]` in the `HostConfig` part of container `createOptions`. By default, the crash dump is uploaded to the [RealTimePersonTracking](https://appcenter.ms/orgs/Microsoft-Organization/apps/RealTimePersonTracking/crashes/errors?version=&appBuild=&period=last90Days&status=&errorType=all&sortCol=lastError&sortDir=desc) AppCenter app, if you want the crash dumps to be uploaded to your own AppCenter app, you can override the environment variable `RTPT_APPCENTER_APP_SECRET` with your app's app secret. | `enable_orientation` | bool | Indicates whether you want to compute the orientation for the detected people or not. `enable_orientation` is set by default to False. | +
+### Speed Parameter Settings
+You can configure the speed computation through the tracker node parameter settings.
+```json
+{
+  "enable_speed": true
+}
+```
+| Name | Type| Description|
+||||
+| `enable_speed` | bool | Indicates whether you want to compute the speed for the detected people. `enable_speed` is set to `false` by default. It is highly recommended to enable both speed and orientation to get the best estimated values. |
++ ## Spatial Analysis operations configuration and output ### Zone configuration for cognitiveservices.vision.spatialanalysis-personcount
cognitive-services Whats New https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Computer-vision/whats-new.md
Learn what's new in the service. These items may be release notes, videos, blog posts, and other types of information. Bookmark this page to stay up to date with the service.
+## May 2021
+
+### Spatial Analysis container update
+
+A new version of the [Spatial Analysis container](spatial-analysis-container.md) has been released with a new feature set. This Docker container lets you analyze real-time streaming video to understand spatial relationships between people and their movement through physical environments.
+
+* [Spatial Analysis operations](spatial-analysis-operations.md) can be now configured to detect the orientation that a person is facing.
+ * An orientation classifier can be enabled for the `personcrossingline` and `personcrossingpolygon` operations by configuring the `enable_orientation` parameter. It is set to off by default.
+
+* [Spatial Analysis operations](spatial-analysis-operations.md) now also offers configuration to detect a person's speed while walking/running
+ * Speed can be detected for the `personcrossingline` and `personcrossingpolygon` operations by turning on the `enable_speed` classifier, which is off by default. The output is reflected in the `speed`, `avgSpeed`, and `minSpeed` outputs.
++ ## April 2021 ### Computer Vision v3.2 GA
Follow an [Extract text quickstart](https://github.com/Azure-Samples/cognitive-s
## Cognitive Service updates
-[Azure update announcements for Cognitive Services](https://azure.microsoft.com/updates/?product=cognitive-services)
+[Azure update announcements for Cognitive Services](https://azure.microsoft.com/updates/?product=cognitive-services)
cognitive-services Copy Move Projects https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Custom-Vision-Service/copy-move-projects.md
After you've created and trained a Custom Vision project, you may want to copy y
The **[ExportProject](https://southcentralus.dev.cognitive.microsoft.com/docs/services/Custom_Vision_Training_3.3/operations/5eb0bcc6548b571998fddeb3)** and **[ImportProject](https://southcentralus.dev.cognitive.microsoft.com/docs/services/Custom_Vision_Training_3.3/operations/5eb0bcc7548b571998fddee3)** APIs enable this scenario by allowing you to copy projects from one Custom Vision account into others. This guide shows you how to use these REST APIs with cURL. You can also use an HTTP request service like Postman to issue the requests.
+> [!TIP]
+> For an example of this scenario using the Python client library, see the [Move Custom Vision Project](https://github.com/Azure-Samples/custom-vision-move-project/tree/master/) repository on GitHub.
+ ## Business scenarios If your app or business depends on the use of a Custom Vision project, we recommend you copy your model to another Custom Vision account in another region. Then if a regional outage occurs, you can access your project in the region where it was copied.
cognitive-services Export Your Model https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Custom-Vision-Service/export-your-model.md
Custom Vision Service allows classifiers to be exported to run offline. You can
Custom Vision Service supports the following exports: * __Tensorflow__ for __Android__.
+* **TensorflowJS** for JavaScript frameworks like React, Angular, and Vue. This will run on both **Android** and **iOS** devices.
* __CoreML__ for __iOS11__.
-* __ONNX__ for __Windows ML__.
+* __ONNX__ for __Windows ML__, **Android**, and **iOS**.
* __[Vision AI Developer Kit](https://azure.github.io/Vision-AI-DevKit-Pages/)__. * A __Docker container__ for Windows, Linux, or ARM architecture. The container includes a Tensorflow model and service code to use the Custom Vision API.
cognitive-services Limits And Quotas https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Custom-Vision-Service/limits-and-quotas.md
Previously updated : 03/25/2019 Last updated : 05/13/2021
There are two tiers of keys for the Custom Vision service. You can sign up for a
The number of training images per project and tags per project are expected to increase over time for S0 projects.
-|Factor|**F0**|**S0**|
+|Factor|**F0 (free)**|**S0 (standard)**|
|--|--|--| |Projects|2|100| |Training images per project |5,000|100,000|
cognitive-services Storage Integration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Custom-Vision-Service/storage-integration.md
This guide shows you how to use these REST APIs with cURL. You can also use an H
## Prerequisites - A Custom Vision resource in Azure. If you don't have one, go to the Azure portal and [create a new Custom Vision resource](https://portal.azure.com/?microsoft_azure_marketplace_ItemHideKey=microsoft_azure_cognitiveservices_customvision#create/Microsoft.CognitiveServicesCustomVision?azure-portal=true). This feature doesn't currently support the Cognitive Service resource (all in one key).-- An Azure Storage account with a blob container. Follow [Exercises 1 of the Azure Storage Lab](https://github.com/Microsoft/computerscience/blob/master/Labs/Azure%20Services/Azure%20Storage/Azure%20Storage%20and%20Cognitive%20Services%20(MVC).md#Exercise1) if you need help with this step.
-* [PowerShell version 6.0+](/powershell/scripting/install/installing-powershell-core-on-windows), or a similar command-line application.
+- An Azure Storage account with a blob container. Follow the [Storage quickstart](https://docs.microsoft.com/azure/storage/blobs/storage-quickstart-blobs-portal) if you need help with this step.
+- [PowerShell version 6.0+](/powershell/scripting/install/installing-powershell-core-on-windows), or a similar command-line application.
## Set up Azure storage integration
The `"exportStatus"` field may be either `"ExportCompleted"` or `"ExportFailed"`
In this guide, you learned how to copy and move a project between Custom Vision resources. Next, explore the API reference docs to see what else you can do with Custom Vision.

* [REST API reference documentation (training)](https://southcentralus.dev.cognitive.microsoft.com/docs/services/Custom_Vision_Training_3.3/operations/5eb0bcc6548b571998fddeb3)
-* [REST API reference documentation (prediction)](https://southcentralus.dev.cognitive.microsoft.com/docs/services/Custom_Vision_Prediction_3.1/operations/5eb37d24548b571998fde5f3)
+* [REST API reference documentation (prediction)](https://southcentralus.dev.cognitive.microsoft.com/docs/services/Custom_Vision_Prediction_3.1/operations/5eb37d24548b571998fde5f3)
cognitive-services Luis Glossary https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/LUIS/luis-glossary.md
description: The glossary explains terms that you might encounter as you work wi
Previously updated : 05/08/2020 Last updated : 05/17/2021 # Language understanding glossary of common vocabulary and concepts
The [authoring key](luis-how-to-azure-subscription.md) is used to author the app
### Authoring Resource
-Your LUIS [authoring resource](luis-how-to-azure-subscription.md#azure-resources-for-luis) is a manageable item that is available through Azure. The resource is your access to the associated authoring, training, and publishing abilities of the Azure service. The resource includes authentication, authorization, and security information you need to access the associated Azure service.
+Your LUIS [authoring resource](luis-how-to-azure-subscription.md) is a manageable item that is available through Azure. The resource is your access to the associated authoring, training, and publishing abilities of the Azure service. The resource includes authentication, authorization, and security information you need to access the associated Azure service.
The authoring resource has an Azure "kind" of `LUIS-Authoring`.
cognitive-services Luis How To Azure Subscription https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/LUIS/luis-how-to-azure-subscription.md
Title: Using authoring and runtime keys - LUIS
-description: When you first use LUIS, you don't need to create an authoring key. When you want to publish the app and then use your runtime endpoint, you need to create and assign the runtime key to the app.
+ Title: How to create and manage LUIS resources
+
+description: Learn how to use and manage Azure resources for LUIS.
+++ Previously updated : 09/07/2020- Last updated : 05/17/2021
-# Create LUIS resources
+# How to create and manage LUIS resources
-Authoring and query prediction runtime resources provide authentication to your Language Understanding (LUIS) app and prediction endpoint.
+Use this article to learn about the types of Azure resources you can use with LUIS, and how to manage them.
-<a name="azure-resources-for-luis"></a>
-<a name="programmatic-key" ></a>
-<a name="endpoint-key"></a>
-<a name="authoring-key"></a>
+## Authoring Resource
-## LUIS resources
+An authoring resource lets you create, manage, train, test, and publish your applications. One [pricing tier](https://azure.microsoft.com/pricing/details/cognitive-services/language-understanding-intelligent-services/) is available for the LUIS authoring resource - the free (F0) tier, which gives you:
-LUIS allows three types of Azure resources and one non-Azure resource:
+* 1 million authoring transactions
+* 1,000 testing prediction endpoint requests per month.
-|Resource|Purpose|Cognitive service `kind`|Cognitive service `type`|
-|--|--|--|--|
-|Authoring resource|Allows you to create, manage, train, test, and publish your applications. [Create a LUIS authoring resource](luis-how-to-azure-subscription.md#create-luis-resources-in-the-azure-portal) if you intend to author LUIS apps programmatically or from the LUIS portal. You need to [migrate your LUIS account](luis-migration-authoring.md#what-is-migration) before you link your Azure authoring resources to your application. You can control permissions to the authoring resource by assigning people [the contributor role](#contributions-from-other-authors). <br><br> One tier is available for the LUIS authoring resource:<br> <ul> <li>**Free F0 authoring resource**, which gives you 1 million free authoring transactions and 1,000 free testing prediction endpoint requests monthly. |`LUIS.Authoring`|`Cognitive Services`|
-|Prediction resource| After you publish your LUIS application, use the prediction resource/key to query prediction endpoint requests. Create a LUIS prediction resource before your client app requests predictions beyond the 1,000 requests provided by the authoring or starter resource. <br><br> Two tiers are available for the prediction resource:<br><ul> <li> **Free F0 prediction resource**, which gives you 10,000 free prediction endpoint requests monthly.<br> <li> **Standard S0 prediction resource**, which is the paid tier. [Learn more about pricing.](https://azure.microsoft.com/pricing/details/cognitive-services/language-understanding-intelligent-services/)|`LUIS`|`Cognitive Services`|
-|Starter/Trial resource|Allows you to create, manage, train, test, and publish your applications. This resource is created by default if you choose the starter resource option when you first sign in to LUIS. The starter key will eventually be deprecated. All LUIS users will need to [migrate their accounts](luis-migration-authoring.md#what-is-migration) and link their LUIS applications to an authoring resource. Unlike the authoring resource, this resource doesn't give you permissions for Azure role-based access control. <br><br> Like the authoring resource, the starter resource gives you 1 million free authoring transactions and 1,000 free testing prediction endpoint requests.|-|Not an Azure resource.|
-|[Cognitive Services multiservice resource key](../cognitive-services-apis-create-account-cli.md?tabs=windows#create-a-cognitive-services-resource)|Query prediction endpoint requests shared with LUIS and other supported cognitive services.|`CognitiveServices`|`Cognitive Services`|
+You can use the [v3.0-preview LUIS Programmatic APIs](https://westus.dev.cognitive.microsoft.com/docs/services/luis-programmatic-apis-v3-0-preview/operations/5890b47c39e2bb052c5b9c2f) to manage authoring resources.
+## Prediction resource
-> [!Note]
-> LUIS provides two types of F0 (free tier) resources: one for authoring transactions and one for prediction transactions. If you're running out of free quota for prediction transactions, make sure you're using the F0 prediction resource, which gives you a 10,000 free transactions monthly, and not the authoring resource, which gives you 1,000 prediction transactions monthly.
-
-When the Azure resource creation process is finished, [assign the resource](#assign-a-resource-to-an-app) to the app in the LUIS portal.
-
-> [!important]
-> You should author LUIS apps in the [regions](luis-reference-regions.md#publishing-regions) where you want to publish and query.
-
-## Resource ownership
-
-An Azure resource, like a LUIS resource, is owned by the subscription that contains the resource.
-
-To change the ownership of a resource, you can take one of these actions:
-* Transfer the [ownership](../../cost-management-billing/manage/billing-subscription-transfer.md) of your subscription.
-* Export the LUIS app as a file, and then import the app on a different subscription. Export is available on the **My apps** page in the LUIS portal.
-
-## Resource limits
-
-### Authoring key creation limits
-
-You can create as many as 10 authoring keys per region, per subscription. Publishing regions are different from authoring regions. Make sure you create an app in the authoring region that corresponds to the publishing region where you want your client application to be located. For information on how authoring regions map to publishing regions, see [Authoring and publishing regions](luis-reference-regions.md).
-
-For more information on key limits, see [key limits](luis-limits.md#key-limits).
-
-### Errors for key usage limits
+A prediction resource lets you query your prediction endpoint beyond the 1,000 requests provided by the authoring resource. Two [pricing tiers](https://azure.microsoft.com/pricing/details/cognitive-services/language-understanding-intelligent-services/) are available for the prediction resource:
-Usage limits are based on the pricing tier.
-
-If you exceed your transactions-per-second (TPS) quota, you receive an HTTP 429 error. If you exceed your transaction-per-month (TPM) quota, you receive an HTTP 403 error.
--
-### Reset an authoring key
+* The free (F0) prediction resource, which gives you 10,000 prediction endpoint requests monthly.
+* The standard (S0) prediction resource, which is the paid tier.
-For [migrated authoring resource](luis-migration-authoring.md) apps: If your authoring key is compromised, reset the key in the Azure portal, on the **Keys** page for the authoring resource.
-
-For apps that haven't been migrated: The key is reset on all your apps in the LUIS portal. If you author your apps via the authoring APIs, you need to change the value of `Ocp-Apim-Subscription-Key` to the new key.
-
-### Regenerate an Azure key
-
-You can regenerate an Azure key from the **Keys** page in the Azure portal.
--
-<a name="securing-the-endpoint"></a>
-
-## App ownership, access, and security
-
-An app is defined by its Azure resources, which are determined by the owner's subscription.
-
-You can move your LUIS app. Use the following resources to help you do so by using the Azure portal or Azure CLI:
-
-* [Move an app between LUIS authoring resources](https://westus.dev.cognitive.microsoft.com/docs/services/5890b47c39e2bb17b84a55ff/operations/apps-move-app-to-another-luis-authoring-azure-resource)
-* [Move a resource to a new resource group or subscription](../../azure-resource-manager/management/move-resource-group-and-subscription.md)
-* [Move a resource within the same subscription or across subscriptions](../../azure-resource-manager/management/move-limitations/app-service-move-limitations.md)
--
-### Contributions from other authors
-
-For [migrated authoring resource](luis-migration-authoring.md) apps: You can manage _contributors_ for an authoring resource in the Azure portal by using the **Access control (IAM)** page. Learn [how to add a user](luis-how-to-collaborate.md) by using the collaborator's email address and the contributor role.
-
-For apps that haven't yet migrated: You can manage all _collaborators_ on the **Manage -> Collaborators** page in the LUIS portal.
-
-### Query prediction access for private and public apps
-
-For private apps, query prediction runtime access is available for owners and contributors. For public apps, runtime access is available to users who have their own Azure [Cognitive Service](../cognitive-services-apis-create-account.md) or [LUIS](#create-resources-in-the-azure-portal) runtime resource and the public app's ID.
-
-There isn't currently a catalog of public apps.
-
-### Authoring permissions and access
-Access to an app from the [LUIS](luis-reference-regions.md#luis-website) portal or the [authoring APIs](https://go.microsoft.com/fwlink/?linkid=2092087) is controlled by the Azure authoring resource.
-
-The owner and all contributors have access to author the app.
-
-|Authoring access includes:|Notes|
-|--|--|
-|Add or remove endpoint keys||
-|Export version||
-|Export endpoint logs||
-|Import version||
-|Make app public|When an app is public, anyone who has an authoring or endpoint key can query the app.|
-|Modify model|
-|Publish|
-|Review endpoint utterances for [active learning](luis-how-to-review-endpoint-utterances.md)|
-|Train|
-
-<a name="prediction-endpoint-runtime-key"></a>
-
-### Prediction endpoint runtime access
-
-Access for querying the prediction endpoint is controlled by a setting on the **Application Information** page in the **Manage** section.
-
-|[Private endpoint](#runtime-security-for-private-apps)|[Public endpoint](#runtime-security-for-public-apps)|
-|:--|:--|
-|Available to owner and contributors|Available to owner, contributors, and anyone else who knows the app ID|
-
-You can control who sees your LUIS runtime key by calling it in a server-to-server environment. If you're using LUIS from a bot, the connection between the bot and LUIS is already more secure. If you're calling the LUIS endpoint directly, you should create a server-side API (like an Azure [function](https://azure.microsoft.com/services/functions/)) with controlled access (via something like [Azure AD](https://azure.microsoft.com/services/active-directory/)). When the server-side API is called and authenticated and authorization is verified, pass the call on to LUIS. This strategy doesn't prevent man-in-the-middle attacks. But it does obfuscate your key and endpoint URL from your users, allow you to track access, and allow you to add endpoint response logging (like [Application Insights](https://azure.microsoft.com/services/application-insights/)).
-
-### Runtime security for private apps
-
-A private app's runtime is available only to the following keys:
-
-|Key and user|Explanation|
-|--|--|
-|Owner's authoring key| Up to 1,000 endpoint hits|
-|Collaborator/contributor authoring keys| Up to 1,000 endpoint hits|
-|Any key assigned to LUIS by an author or collaborator/contributor|Based on key usage tier|
-
-### Runtime security for public apps
-
-When your app is configured as public, _any_ valid LUIS authoring key or LUIS endpoint key can query it, as long as the key hasn't used the entire endpoint quota.
-
-A user who isn't an owner or contributor can access a public app's runtime only if given the app ID. LUIS doesn't have a public market or any other way for users to search for a public app.
-
-A public app is published in all regions. So a user with a region-based LUIS resource key can access the app in whichever region is associated with the resource key.
+You can use the [v3.0-preview LUIS Endpoint API](https://westus.dev.cognitive.microsoft.com/docs/services/luis-endpoint-api-v3-0-preview/operations/5f68f4d40a511ce5a7440859) to manage prediction resources.
+> [!Note]
+> * You can also use a [multi-service resource](../cognitive-services-apis-create-account-cli.md?tabs=multiservice) to get a single endpoint you can use for multiple Cognitive Services.
+> * LUIS provides two types of F0 (free tier) resources: one for authoring transactions and one for prediction transactions. If you're running out of free quota for prediction transactions, make sure you're using the F0 prediction resource, which gives you 10,000 free transactions monthly, and not the authoring resource, which gives you 1,000 prediction transactions monthly.
+> * You should author LUIS apps in the [regions](luis-reference-regions.md#publishing-regions) where you want to publish and query.
-### Control access to your query prediction endpoint
+## Create LUIS resources
-You can control who can see your LUIS prediction runtime endpoint key by calling it in a server-to-server environment. If you're using LUIS from a bot, the connection between the bot and LUIS is already more secure. If you're calling the LUIS endpoint directly, you should create a server-side API (like an Azure [function](https://azure.microsoft.com/services/functions/)) with controlled access (via something like [Azure AD](https://azure.microsoft.com/services/active-directory/)). When the server-side API is called and authentication and authorization are verified, pass the call on to LUIS. This strategy doesn't prevent man-in-the-middle attacks. But it does obfuscate your endpoint from your users, allow you to track access, and allow you to add endpoint response logging (like [Application Insights](https://azure.microsoft.com/services/application-insights/)).
+To create LUIS resources, you can use the LUIS portal, the [Azure portal](https://ms.portal.azure.com/#create/Microsoft.CognitiveServicesLUISAllInOne), or the Azure CLI. After you've created your resources, you'll need to assign them to your apps.
-<a name="starter-key"></a>
+# [LUIS portal](#tab/portal)
-## Sign in to the LUIS portal and begin authoring
+### Create a LUIS authoring resource using the LUIS portal
-1. Sign in to the [LUIS portal](https://www.luis.ai) and agree to the terms of use.
-1. Start authoring your LUIS app by choosing your Azure LUIS authoring key:
+1. Sign in to the [LUIS portal](https://www.luis.ai), select your country/region, and agree to the terms of use. If you see the **My Apps** section in the portal, a LUIS resource already exists, and you can skip the next step.
- ![Screenshot that shows the welcome screen.](./media/luis-how-to-azure-subscription/sign-in-create-resource.png)
+2. In the **Choose an authoring** window that appears, find your Azure subscription, and LUIS authoring resource. If you don't have a resource, you can create a new one.
-1. When you're done with the resource selection process, [create a new app](luis-how-to-start-new-app.md#create-new-app-in-luis).
+ :::image type="content" source="./media/luis-how-to-azure-subscription/choose-authoring-resource.png" alt-text="Choose a type of Language Understanding authoring resource.":::
+
+ When you create a new authoring resource, provide the following information:
+ * **Tenant name**: the tenant your Azure subscription is associated with.
+ * **Azure subscription name**: the subscription that will be billed for the resource.
+ * **Azure resource group name**: a custom resource group name you choose or create. Resource groups allow you to group Azure resources for access and management.
+ * **Azure resource name**: a custom name you choose, used as part of the URL for your authoring and prediction endpoint queries.
+ * **Pricing tier**: the pricing tier determines the maximum transactions per second and per month.
+### Create a LUIS Prediction resource using the LUIS portal
-<a name="create-azure-resources"></a>
-<a name="create-resources-in-the-azure-portal"></a>
+# [Azure CLI](#tab/cli)
-### Create resources in the Azure CLI
+### Create LUIS resources in the Azure CLI
Use the [Azure CLI](/cli/azure/install-azure-cli) to create each resource individually.
-Resource `kind`:
-
-* Authoring: `LUIS.Authoring`
-* Prediction: `LUIS`
+> [!TIP]
+> * The authoring resource `kind` is `LUIS.Authoring`
+> * The prediction resource `kind` is `LUIS`
1. Sign in to the Azure CLI:
Resource `kind`:
This command opens a browser so you can select the correct account and provide authentication.
-1. Create a LUIS authoring resource of kind `LUIS.Authoring`, named `my-luis-authoring-resource`. Create it in the _existing_ resource group named `my-resource-group` for the `westus` region.
+2. Create a LUIS authoring resource of kind `LUIS.Authoring`, named `my-luis-authoring-resource`. Create it in the _existing_ resource group named `my-resource-group` for the `westus` region.
    ```azurecli
    az cognitiveservices account create -n my-luis-authoring-resource -g my-resource-group --kind LUIS.Authoring --sku F0 -l westus --yes
    ```
-1. Create a LUIS prediction endpoint resource of kind `LUIS`, named `my-luis-prediction-resource`. Create it in the _existing_ resource group named `my-resource-group` for the `westus` region. If you want higher throughput than the free tier provides, change `F0` to `S0`. [Learn more about pricing tiers and throughput.](luis-limits.md#key-limits)
+3. Create a LUIS prediction endpoint resource of kind `LUIS`, named `my-luis-prediction-resource`. Create it in the _existing_ resource group named `my-resource-group` for the `westus` region. If you want higher throughput than the free tier provides, change `F0` to `S0`. [Learn more about pricing tiers and throughput.](luis-limits.md#key-limits)
    ```azurecli
    az cognitiveservices account create -n my-luis-prediction-resource -g my-resource-group --kind LUIS --sku F0 -l westus --yes
    ```
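After the resources are created, you can retrieve their keys from the CLI so you can assign them in the LUIS portal. A minimal sketch, using the resource and group names from the steps above:

```shell
# List the keys for the authoring resource created above
az cognitiveservices account keys list -n my-luis-authoring-resource -g my-resource-group

# List the keys for the prediction resource created above
az cognitiveservices account keys list -n my-luis-prediction-resource -g my-resource-group
```

Each command returns a JSON object with `key1` and `key2` for the resource.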
- > [!Note]
- > These keys aren't used by the LUIS portal until they're assigned on the **Manage** > **Azure Resources** page in the LUIS portal.
+++
+## Assign LUIS resources
+
+Creating a resource doesn't mean it's put to use; you need to assign it to your apps. You can assign an authoring resource for a single app or for all apps in LUIS.
-<a name="assign-an-authoring-resource-in-the-luis-portal-for-all-apps"></a>
+# [LUIS portal](#tab/portal)
-### Assign resources in the LUIS portal
+### Assign resources using the LUIS portal
-You can assign an authoring resource for a single app or for all apps in LUIS. The following procedure assigns all apps to a single authoring resource.
+**Assign an authoring resource to all your apps**
+
+ The following procedure assigns the authoring resource to all apps.
1. Sign in to the [LUIS portal](https://www.luis.ai).
1. In the upper-right corner, select your user account, and then select **Settings**.
1. On the **User Settings** page, select **Add authoring resource**, and then select an existing authoring resource. Select **Save**.
-## Assign a resource to an app
-
->[!NOTE]
->If you don't have an Azure subscription, you won't be able to assign or create a new resource. You'll need to create an [Azure free account](https://azure.microsoft.com/en-us/free/) and then return to LUIS to create a new resource from the portal.
+**Assign a resource to a specific app**
-You can use this procedure to create an authoring or prediction resource or assign one to an application:
+The following procedure assigns a resource to a specific app.
1. Sign in to the [LUIS portal](https://www.luis.ai). Select an app from the **My apps** list.
1. Go to **Manage** > **Azure Resources**:
- ![Screenshot that shows the Azure Resources page.](./media/luis-how-to-azure-subscription/manage-azure-resources-prediction.png)
+ :::image type="content" source="./media/luis-how-to-azure-subscription/manage-azure-resources-prediction.png" alt-text="Choose a type of Language Understanding prediction resource." lightbox="./media/luis-how-to-azure-subscription/manage-azure-resources-prediction.png":::
1. On the **Prediction resource** or **Authoring resource** tab, select the **Add prediction resource** or **Add authoring resource** button.
1. Use the fields in the form to find the correct resource, and then select **Save**.
-1. If you don't have an existing resource, you can create one by selecting **Create a new LUIS resource?** at the bottom of the window.
+# [Azure CLI](#tab/cli)
-### Assign a query prediction runtime resource without using the LUIS portal
+## Assign prediction resource programmatically
-For automated processes like CI/CD pipelines, you might want to automate the assignment of a LUIS runtime resource to a LUIS app. To do so, complete these steps:
+For automated processes like CI/CD pipelines, you can automate the assignment of a LUIS resource to a LUIS app with the following steps:
-1. Get an Azure Resource Manager token from [this website](https://resources.azure.com/api/token?plaintext=true). This token does expire, so use it right away. The request returns an Azure Resource Manager token.
+1. Get an [Azure Resource Manager token](https://resources.azure.com/api/token?plaintext=true), which is an alphanumeric string. This token expires, so use it right away. You can also get a token with the following Azure CLI command.
    ```azurecli
    az account get-access-token --resource=https://management.core.windows.net/ --query accessToken --output tsv
    ```
- ![Screenshot that shows the website for requesting an Azure Resource Manager token.](./media/luis-manage-keys/get-arm-token.png)
-
-1. Use the token to request the LUIS runtime resources across subscriptions. Use the [Get LUIS Azure accounts API](https://westus.dev.cognitive.microsoft.com/docs/services/5890b47c39e2bb17b84a55ff/operations/5be313cec181ae720aa2b26c), which your user account has access to.
+1. Use the token to request the LUIS runtime resources across subscriptions. Use the API to [get the LUIS Azure account](https://westus.dev.cognitive.microsoft.com/docs/services/5890b47c39e2bb17b84a55ff/operations/5be313cec181ae720aa2b26c) that your user account has access to.
This POST API requires the following values:
For automated processes like CI/CD pipelines, you might want to automate the ass
|Header|`Ocp-Apim-Subscription-Key`|Your authoring key.| |Header|`Content-type`|`application/json`| |Querystring|`appid`|The LUIS app ID.
- |Body||{"AzureSubscriptionId":"ddda2925-af7f-4b05-9ba1-2155c5fe8a8e",<br>"ResourceGroup": "resourcegroup-2",<br>"AccountName": "luis-uswest-S0-2"}|
+ |Body||{`AzureSubscriptionId`: Your Subscription ID,<br>`ResourceGroup`: Resource Group name that has your prediction resource,<br>`AccountName`: Name of your prediction resource}|
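    As an illustration, the assign request can be sketched with cURL. The `westus` authoring endpoint and all angle-bracket values below are placeholder assumptions to replace with your own:

    ```shell
    # Sketch only: replace the region, app ID, token, key, and body values with your own.
    curl -X POST "https://westus.api.cognitive.microsoft.com/luis/api/v2.0/apps/<APP_ID>/azureaccounts" \
      -H "Authorization: Bearer <ARM_TOKEN>" \
      -H "Ocp-Apim-Subscription-Key: <AUTHORING_KEY>" \
      -H "Content-Type: application/json" \
      -d '{"AzureSubscriptionId":"<SUBSCRIPTION_ID>","ResourceGroup":"<RESOURCE_GROUP>","AccountName":"<PREDICTION_RESOURCE_NAME>"}'
    ```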
When this API is successful, it returns `201 - created status`.

## Unassign a resource
+When you unassign a resource, it's not deleted from Azure. It's only unlinked from LUIS.
+
+# [LUIS portal](#tab/portal)
+
+## Unassign resources using LUIS portal
1. Sign in to the [LUIS portal](https://www.luis.ai), and then select an app from the **My apps** list.
1. Go to **Manage** > **Azure Resources**.
-1. On the **Prediction resource** or **Authoring resource** tab, select the **Unassign resource** button for the resource.
+1. Select the **Unassign resource** button for the resource.
-When you unassign a resource, it's not deleted from Azure. It's only unlinked from LUIS.
+# [Azure CLI](#tab/cli)
+## Unassign prediction resource programmatically
-## Delete an account
+1. Get an [Azure Resource Manager token](https://resources.azure.com/api/token?plaintext=true), which is an alphanumeric string. This token expires, so use it right away. You can also get a token with the following Azure CLI command.
-See [Data storage and removal](luis-concept-data-storage.md#accounts) for information about what data is deleted when you delete your account.
+ ```azurecli
+ az account get-access-token --resource=https://management.core.windows.net/ --query accessToken --output tsv
+ ```
+
+1. Use the token to request the LUIS runtime resources across subscriptions. Use the [Get LUIS Azure accounts API](https://westus.dev.cognitive.microsoft.com/docs/services/5890b47c39e2bb17b84a55ff/operations/5be313cec181ae720aa2b26c), which your user account has access to.
-## Change the pricing tier
+ This API requires the following values:
+
+ |Header|Value|
+ |--|--|
+ |`Authorization`|The value of `Authorization` is `Bearer {token}`. The token value must be preceded by the word `Bearer` and a space.|
+ |`Ocp-Apim-Subscription-Key`|Your authoring key.|
+
+ The API returns an array of JSON objects that represent your LUIS subscriptions. Returned values include the subscription ID, resource group, and resource name, returned as `AccountName`. Find the item in the array that's the LUIS resource that you want to assign to the LUIS app.
+
+1. Unassign the LUIS resource from the app by using the [Unassign a LUIS Azure account from an application](https://westus.dev.cognitive.microsoft.com/docs/services/5890b47c39e2bb17b84a55ff/operations/5be32554f8591db3a86232e1/console) API.
+
+ This DELETE API requires the following values:
+
+ |Type|Setting|Value|
+ |--|--|--|
+ |Header|`Authorization`|The value of `Authorization` is `Bearer {token}`. The token value must be preceded by the word `Bearer` and a space.|
+ |Header|`Ocp-Apim-Subscription-Key`|Your authoring key.|
+ |Header|`Content-type`|`application/json`|
+ |Querystring|`appid`|The LUIS app ID.
+ |Body||{`AzureSubscriptionId`: Your Subscription ID,<br>`ResourceGroup`: Resource Group name that has your prediction resource,<br>`AccountName`: Name of your prediction resource}|
-1. In [the Azure portal](https://portal.azure.com), find and select your LUIS subscription:
+ When this API is successful, it returns `200 - OK status`.
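    As with assignment, the unassign call can be sketched with cURL. The `westus` authoring endpoint and all angle-bracket values below are placeholder assumptions to replace with your own:

    ```shell
    # Sketch only: replace the region, app ID, token, key, and body values with your own.
    curl -X DELETE "https://westus.api.cognitive.microsoft.com/luis/api/v2.0/apps/<APP_ID>/azureaccounts" \
      -H "Authorization: Bearer <ARM_TOKEN>" \
      -H "Ocp-Apim-Subscription-Key: <AUTHORING_KEY>" \
      -H "Content-Type: application/json" \
      -d '{"AzureSubscriptionId":"<SUBSCRIPTION_ID>","ResourceGroup":"<RESOURCE_GROUP>","AccountName":"<PREDICTION_RESOURCE_NAME>"}'
    ```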
- ![Screenshot that shows a LUIS subscription in the Azure portal.](./media/luis-usage-tiers/find.png)
-1. Select **Pricing tier** to see the available pricing tiers:
+
- ![Screenshot that shows the Pricing tier menu item.](./media/luis-usage-tiers/subscription.png)
-1. Select the pricing tier, and then click **Select** to save your change:
+## Resource ownership
- ![Screenshot that shows how to select and save a pricing tier.](./media/luis-usage-tiers/plans.png)
+An Azure resource, like a LUIS resource, is owned by the subscription that contains the resource.
- When the pricing change is complete, a pop-up window verifies the pricing tier update:
+To change the ownership of a resource, you can take one of these actions:
+* Transfer the [ownership](../../cost-management-billing/manage/billing-subscription-transfer.md) of your subscription.
+* Export the LUIS app as a file, and then import the app on a different subscription. Export is available on the **My apps** page in the LUIS portal.
- ![Screenshot of the pop-up window that verifies the pricing update.](./media/luis-usage-tiers/updated.png)
-1. Remember to [assign this endpoint key](#assign-a-resource-to-an-app) on the **Publish** page and use it in all endpoint queries.
+## Resource limits
+
+### Authoring key creation limits
+
+You can create as many as 10 authoring keys per region, per subscription. Publishing regions are different from authoring regions. Make sure you create an app in the authoring region that corresponds to the publishing region where you want your client application to be located. For information on how authoring regions map to publishing regions, see [Authoring and publishing regions](luis-reference-regions.md).
+
+For more information on key limits, see [key limits](luis-limits.md#key-limits).
+
+### Errors for key usage limits
+
+Usage limits are based on the pricing tier.
+
+If you exceed your transactions-per-second (TPS) quota, you receive an HTTP 429 error. If you exceed your transaction-per-month (TPM) quota, you receive an HTTP 403 error.
+
+## Change the pricing tier
+
+1. In [the Azure portal](https://portal.azure.com), go to **All resources** and select your resource.
+
+ :::image type="content" source="./media/luis-usage-tiers/find.png" alt-text="Screenshot that shows a LUIS subscription in the Azure portal." lightbox="./media/luis-usage-tiers/find.png":::
+
+1. From the left side menu, select **Pricing tier** to see the available pricing tiers.
+1. Select the pricing tier you want, and click **Select** to save your change. When the pricing change is complete, a notification will appear in the top right with the pricing tier update.
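The pricing tier can also be changed from the command line. A sketch with the Azure CLI, assuming a prediction resource named `my-luis-prediction-resource` in the resource group `my-resource-group`:

```shell
# Move the resource from the free (F0) tier to the standard (S0) tier.
az cognitiveservices account update -n my-luis-prediction-resource -g my-resource-group --sku S0
```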
## View Azure resource metrics
-### View a summary of Azure resource usage
+## View a summary of Azure resource usage
You can view LUIS usage information in the Azure portal. The **Overview** page shows a summary, including recent calls and errors. If you make a LUIS endpoint request, allow up to five minutes for the change to appear.
-![Screenshot that shows the Overview page.](./media/luis-usage-tiers/overview.png)
-### Customizing Azure resource usage charts
-The **Metrics** page provides a more detailed view of the data:
+## Customizing Azure resource usage charts
+The **Metrics** page provides a more detailed view of the data.
+You can configure your metrics charts for a specific **time period** and **metric**.
-![Screenshot that shows the Metrics page.](./media/luis-usage-tiers/metrics-default.png)
-You can configure your metrics charts for a specific time period and metric type:
+## Total transactions threshold alert
+If you want to know when you reach a certain transaction threshold, for example, 10,000 transactions, you can create an alert:
-![Screenshot that shows a customized chart.](./media/luis-usage-tiers/metrics-custom.png)
+1. From the left-side menu, select **Alerts**.
+2. From the top menu, select **New alert rule**.
-### Total transactions threshold alert
-If you want to know when you reach a certain transaction threshold, for example 10,000 transactions, you can create an alert:
+ :::image type="content" source="./media/luis-usage-tiers/alerts.png" alt-text="Screenshot that shows the alert rules page." lightbox="./media/luis-usage-tiers/alerts.png":::
+
+3. Click **Add condition**.
-![Screenshot that shows the Alert rules page.](./media/luis-usage-tiers/alert-default.png)
+ :::image type="content" source="./media/luis-usage-tiers/alerts-2.png" alt-text="Screenshot that shows the add condition page for alert rules." lightbox="./media/luis-usage-tiers/alerts-2.png":::
+
+4. Select **Total calls**.
+
+ :::image type="content" source="./media/luis-usage-tiers/alerts-3.png" alt-text="Screenshot that shows the total calls page for alerts." lightbox="./media/luis-usage-tiers/alerts-3.png":::
+
+5. Scroll down to the **Alert logic** section, set the attributes as you want, and click **Done**.
+
+ :::image type="content" source="./media/luis-usage-tiers/alerts-4.png" alt-text="Screenshot that shows the alert logic page." lightbox="./media/luis-usage-tiers/alerts-4.png":::
+
+6. To send notifications or invoke actions when the alert rule triggers, go to the **Actions** section and add your action group.
+
+ :::image type="content" source="./media/luis-usage-tiers/alerts-5.png" alt-text="Screenshot that shows the actions page for alerts." lightbox="./media/luis-usage-tiers/alerts-5.png":::
+
+### Reset an authoring key
+
+For [migrated authoring resource](luis-migration-authoring.md) apps: If your authoring key is compromised, reset the key in the Azure portal, on the **Keys** page for the authoring resource.
+
+For apps that haven't been migrated: The key is reset on all your apps in the LUIS portal. If you author your apps via the authoring APIs, you need to change the value of `Ocp-Apim-Subscription-Key` to the new key.
+
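After a key reset, every authoring request must carry the new value. One way to avoid missing a spot is to build the headers in a single helper; a sketch (the constant and function are hypothetical, not an official SDK API):

```python
# Illustrative sketch, not an official SDK API: centralize the authoring key
# so a reset only requires changing one value.
AUTHORING_KEY = "<paste-your-new-authoring-key-here>"  # placeholder

def authoring_headers(key=AUTHORING_KEY):
    """Headers the LUIS authoring REST API expects on every request."""
    return {
        "Ocp-Apim-Subscription-Key": key,
        "Content-Type": "application/json",
    }
```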
+### Regenerate an Azure key
+
+You can regenerate an Azure key from the **Keys** page in the Azure portal.
+
+<a name="securing-the-endpoint"></a>
+
+## App ownership, access, and security
+
+An app is defined by its Azure resources, which are determined by the owner's subscription.
+
+You can move your LUIS app. Use the following resources to help you do so by using the Azure portal or Azure CLI:
+
+* [Move an app between LUIS authoring resources](https://westus.dev.cognitive.microsoft.com/docs/services/5890b47c39e2bb17b84a55ff/operations/apps-move-app-to-another-luis-authoring-azure-resource)
+* [Move a resource to a new resource group or subscription](../../azure-resource-manager/management/move-resource-group-and-subscription.md)
+* [Move a resource within the same subscription or across subscriptions](../../azure-resource-manager/management/move-limitations/app-service-move-limitations.md)
-Add a metric alert for the **total calls** metric for a certain time period. Add email addresses of all the people who should receive the alert. Add webhooks for all the systems that should receive the alert. You can also run a logic app when the alert is triggered.
## Next steps

* Learn [how to use versions](luis-how-to-manage-versions.md) to control your app life cycle.
-* Migrate to the new [authoring resource](luis-migration-authoring.md).
cognitive-services Luis How To Batch Test https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/LUIS/luis-how-to-batch-test.md
Previously updated : 04/13/2021 Last updated : 05/18/2021
The example JSON includes one utterance with a labeled entity to illustrate what
5. Name the dataset `pizza test` and select **Done**.
-6. Select the **Run** button. After the batch test runs, select **See results**.
+6. Select the **Run** button.
+
+7. After the batch test completes, you can see the following columns:
+
+ | Column | Description |
+ | -- | - |
+ | State | Status of the test. **See results** is only visible after the test is completed. |
+ | Name | The name you have given to the test. |
+ | Size | Number of tests in this batch test file. |
+ | Last Run | Date of last run of this batch test file. |
+ | Last result | Number of successful predictions in the test. |
+
+8. To view detailed results of the test, select **See results**.
> [!TIP]
> * Selecting **Download** will download the same file that you uploaded.
cognitive-services Luis How To Collaborate https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/LUIS/luis-how-to-collaborate.md
Previously updated : 01/21/2021 Last updated : 05/17/2021
Learn more about Azure active directory users and consent:
## Next steps

* Learn [how to use versions](luis-how-to-manage-versions.md) to control your app life cycle.
-* Understand the concepts including the [authoring resource](luis-how-to-azure-subscription.md#authoring-key) and [contributors](luis-how-to-azure-subscription.md#contributions-from-other-authors) on that resource.
+* Learn about [authoring resources](luis-how-to-azure-subscription.md) and [adding contributors](luis-how-to-collaborate.md) on that resource.
* Learn [how to create](luis-how-to-azure-subscription.md) authoring and runtime resources
-* Migrate to the new [authoring resource](luis-migration-authoring.md)
+* Migrate to the new [authoring resource](luis-migration-authoring.md)
cognitive-services Luis How To Publish App https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/LUIS/luis-how-to-publish-app.md
By using both publishing slots, this allows you to have two different versions o
### Publishing regions
-The app is published to all regions associated with the LUIS prediction endpoint resources added in the LUIS portal from the **Manage** -> **[Azure Resources](luis-how-to-azure-subscription.md#assign-a-resource-to-an-app)** page.
+The app is published to all regions associated with the LUIS prediction endpoint resources added in the LUIS portal from the **Manage** -> **[Azure Resources](luis-how-to-azure-subscription.md#assign-luis-resources)** page.
For example, for an app created on [www.luis.ai](https://www.luis.ai), if you create a LUIS resource in two regions, **westus** and **eastus**, and add these to the app as resources, the app is published in both regions. For more information about LUIS regions, see [Regions](luis-reference-regions.md).
cognitive-services Luis Migration Authoring https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/LUIS/luis-migration-authoring.md
Migration is the process of changing authoring authentication from an email acco
Migration has to be done from the [LUIS portal](https://www.luis.ai). If you create the authoring keys by using the LUIS CLI, for example, you'll need to complete the migration process in the LUIS portal. You can still have co-authors on your applications after migration, but these will be added on the Azure resource level instead of the application level. Migrating your account can't be reversed.

> [!Note]
-> * If you need to create a prediction runtime resource, there's [a separate process](luis-how-to-azure-subscription.md#create-resources-in-the-azure-portal) to create it.
+> * If you need to create a prediction runtime resource, there's [a separate process](luis-how-to-azure-subscription.md#create-luis-resources) to create it.
> * See the [migration notes](#migration-notes) section below for information on how your applications and contributors will be affected.
> * Authoring your LUIS app is free, as indicated by the F0 tier. Learn [more about pricing tiers](luis-limits.md#key-limits).
If you are having any issues with the migration that are not addressed in the tr
## Next steps

* Review [concepts about authoring and runtime keys](luis-how-to-azure-subscription.md)
-* Review how to [assign keys](luis-how-to-azure-subscription.md) and [add contributors](luis-how-to-collaborate.md)
+* Review how to [assign keys](luis-how-to-azure-subscription.md) and [add contributors](luis-how-to-collaborate.md)
cognitive-services Luis Tutorial Node Import Utterances Csv https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/LUIS/luis-tutorial-node-import-utterances-csv.md
Previously updated : 09/05/2019 Last updated : 05/17/2021
LUIS provides a programmatic API that does everything that the [LUIS](luis-refer
## Prerequisites
-* Sign in to the [LUIS](luis-reference-regions.md) website and find your [authoring key](luis-how-to-azure-subscription.md#authoring-key) in Account Settings. You use this key to call the Authoring APIs.
+* Sign in to the [LUIS](luis-reference-regions.md) website and find your [authoring key](luis-how-to-azure-subscription.md) in Account Settings. You use this key to call the Authoring APIs.
* If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/cognitive-services/) before you begin. * This article starts with a CSV for a hypothetical company's log files of user requests. Download it [here](https://github.com/Azure-Samples/cognitive-services-language-understanding/blob/master/examples/build-app-programmatically-csv/IoT.csv). * Install the latest Node.js with NPM. Download it from [here](https://nodejs.org/en/download/).
cognitive-services Troubleshooting https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/LUIS/troubleshooting.md
description: This article contains answers to frequently asked questions about L
Previously updated : 04/16/2021 Last updated : 05/17/2021

# Language Understanding Frequently Asked Questions (FAQ)
Yes, it is good to train your **None** intent with more utterances as you add mo
See the [Bing Spell Check API V7](luis-tutorial-bing-spellcheck.md) tutorial. LUIS enforces limits imposed by Bing Spell Check API V7. ### How do I edit my LUIS app programmatically?
-To edit your LUIS app programmatically, use the [Authoring API](https://go.microsoft.com/fwlink/?linkid=2092087). See [Call LUIS authoring API](./get-started-get-model-rest-apis.md) and [Build a LUIS app programmatically using Node.js](./luis-tutorial-node-import-utterances-csv.md) for examples of how to call the Authoring API. The Authoring API requires that you use an [authoring key](luis-how-to-azure-subscription.md#azure-resources-for-luis) rather than an endpoint key. Programmatic authoring allows up to 1,000,000 calls per month and five transactions per second. For more info on the keys you use with LUIS, see [Manage keys](./luis-how-to-azure-subscription.md).
+To edit your LUIS app programmatically, use the [Authoring API](https://go.microsoft.com/fwlink/?linkid=2092087). See [Call LUIS authoring API](./get-started-get-model-rest-apis.md) and [Build a LUIS app programmatically using Node.js](./luis-tutorial-node-import-utterances-csv.md) for examples of how to call the Authoring API. The Authoring API requires that you use an [authoring key](luis-how-to-azure-subscription.md) rather than an endpoint key. Programmatic authoring allows up to 1,000,000 calls per month and five transactions per second. For more info on the keys you use with LUIS, see [Manage keys](./luis-how-to-azure-subscription.md).
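To stay under the five-transactions-per-second authoring limit mentioned above, a client can space out its calls. A minimal pacing sketch (the `Pacer` class is illustrative, not part of any LUIS library); the clock and sleep hooks exist only to make it testable:

```python
import time

# Illustrative pacer, not a LUIS API: enforce a minimum interval between
# authoring calls so a client stays under the documented 5 TPS limit.

class Pacer:
    def __init__(self, tps_limit=5, clock=time.monotonic, sleep=time.sleep):
        self.interval = 1.0 / tps_limit  # minimum seconds between calls
        self._clock = clock
        self._sleep = sleep
        self._last = None

    def wait(self):
        """Block just long enough to respect the interval, then record the call."""
        if self._last is not None:
            remaining = self.interval - (self._clock() - self._last)
            if remaining > 0:
                self._sleep(remaining)
        self._last = self._clock()
```

Call `pacer.wait()` immediately before each authoring request; the first call returns at once, and later calls sleep only if less than the interval has elapsed.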
### Where is the Pattern feature that provided regular expression matching? The previous **Pattern feature** is currently deprecated, replaced by **[Patterns](luis-concept-patterns.md)**.
Videos:
To learn more about LUIS, see the following resources: * [Stack Overflow questions tagged with LUIS](https://stackoverflow.com/questions/tagged/luis)
-* [Microsoft Q&A question page for MSDN Language Understanding Intelligent Services (LUIS)](/answers/topics/azure-language-understanding.html)
+* [Microsoft Q&A question page for MSDN Language Understanding Intelligent Services (LUIS)](/answers/topics/azure-language-understanding.html)
cognitive-services Tutorial Machine Learned Entity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/LUIS/tutorial-machine-learned-entity.md
description: Extract structured data from an utterance using the machine-learnin
Previously updated : 05/08/2020 Last updated : 04/28/2020 #Customer intent: As a new user, I want to understand how to extract complex data contained in a user utterance.
The machine-learning entity supports the [model decomposition concept](luis-conc
> * Train, Test, Publish app > * Get entity prediction from endpoint
+<!--
[!INCLUDE [LUIS Free account](includes/quickstart-tutorial-use-free-starter-key.md)]-
+-->
## Why use a machine-learning entity?
In order to receive a LUIS prediction in a chat bot or other client application,
In this tutorial, the app uses a machine-learning entity to find the intent of a user's utterance and extract details from that utterance. Using the machine-learning entity allows you to decompose the details of the entity. > [!div class="nextstepaction"]
-> [Add a prebuilt keyphrase entity](./luis-reference-prebuilt-keyphrase.md)
+> [Add a prebuilt keyphrase entity](./luis-reference-prebuilt-keyphrase.md)
cognitive-services What Is Luis https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/LUIS/what-is-luis.md
keywords: Azure, artificial intelligence, ai, natural language processing, nlp,
Previously updated : 04/16/2021 Last updated : 05/17/2021
This documentation contains the following article types:
- **Plan**: Identify the scenarios that users might use your application for. Define the actions and relevant information that needs to be recognized. - **Build**: Use your authoring resource to develop your app. Start by defining [intents](luis-concept-intent.md) and [entities](luis-concept-entity-types.md). Then, add training [utterances](luis-concept-utterance.md) for each intent. - **Test and Improve**: Start testing your model with other utterances to get a sense of how the app behaves, and you can decide if any improvement is needed. You can improve your application by following these [best practices](luis-concept-best-practices.md). -- **Publish**: Deploy your app for prediction and query the endpoint using your prediction resource. Learn more about authoring and prediction resources [here](luis-how-to-azure-subscription.md#luis-resources).
+- **Publish**: Deploy your app for prediction and query the endpoint using your prediction resource. Learn more about authoring and prediction resources [here](luis-how-to-azure-subscription.md).
- **Connect**: Connect to other services such as [Microsoft Bot framework](/composer/tutorial/tutorial-luis), [QnA Maker](../QnAMaker/choose-natural-language-processing-service.md), and [Speech service](../speech-service/get-started-intent-recognition.md). - **Refine**: [Review endpoint utterances](luis-concept-review-endpoint-utterances.md) to improve your application with real life examples
cognitive-services Create Faq Bot With Azure Bot Service https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/QnAMaker/Tutorials/create-faq-bot-with-azure-bot-service.md
When you make changes to the knowledge base and republish, you don't need to tak
The chat bot responds with an answer from your knowledge base. :::image type="content" source="../media/qnamaker-create-publish-knowledge-base/test-web-chat.png" alt-text="Enter a user query into the test web chat.":::
-1. Light up the Bot in additional [supported channels](/azure/bot-service/bot-service-manage-channels).
+
+## Integrate the bot with channels
- * Click on **Channels** in the Bot service resource.
+Click on **Channels** in the Bot service resource that you created. You can light up the bot in additional [supported channels](/azure/bot-service/bot-service-manage-channels).
>[!div class="mx-imgBorder"] >![Screenshot of integration with teams](../media/qnamaker-tutorial-updates/connect-with-teams.png)+
cognitive-services Custom Neural Voice https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/custom-neural-voice.md
Last updated 02/01/2020
-# What is custom neural voice?
+# What is Custom Neural Voice?
Custom Neural Voice is a [text-to-Speech](./text-to-speech.md)
virtual assistant) or generate audio content offline (e.g., used as in
audio book or instructions in e-learning applications) with the text input provided by the user. This is made available via the [REST API](./rest-text-to-speech.md), the [Speech SDK](./get-started-text-to-speech.md?pivots=programming-language-csharp&tabs=script%2cwindowsinstall),
-or a [web portal](https://speech.microsoft.com/audiocontentcreation).
+or the [web portal](https://speech.microsoft.com/audiocontentcreation).
## Terms and definitions
To learn how to use Custom Neural Voice responsibly, see the [transparency note]
## Next steps
-* [Get started with Custom Voice](how-to-custom-voice.md)
-* [Create and use a Custom Voice endpoint](how-to-custom-voice-create-voice.md)
+* [Get started with Custom Neural Voice](how-to-custom-voice.md)
+* [Create and use your voice model](how-to-custom-voice-create-voice.md)
cognitive-services Faq Text To Speech https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/faq-text-to-speech.md
If you can't find answers to your questions in this FAQ, check out [other suppor
**Q: If I want to use a customized voice model, is the API the same as the one that's used for standard voices?**
-**A**: When a custom voice model is created and deployed, you get a unique endpoint for your model. To use the voice to speak in your apps, you must specify the endpoint in your HTTP requests. The same functionality that's available in the REST API for the Text to Speech service is available for your custom endpoint. Learn how to [create and use your custom endpoint](./how-to-custom-voice-create-voice.md#create-and-use-a-custom-voice-endpoint).
+**A**: When a custom voice model is created and deployed, you get a unique endpoint for your model. To use the voice to speak in your apps, you must specify the endpoint in your HTTP requests. The same functionality that's available in the REST API for the Text to Speech service is available for your custom endpoint. Learn how to [create and use your custom endpoint](./how-to-custom-voice-create-voice.md#create-and-use-a-custom-neural-voice-endpoint).
**Q: Do I need to prepare the training data to create custom voice models on my own?**
cognitive-services Get Started Text To Speech https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/get-started-text-to-speech.md
Previously updated : 10/01/2020 Last updated : 05/17/2021 zone_pivot_groups: programming-languages-set-twenty-four
keywords: text to speech
[!INCLUDE [C++ Basics include](includes/how-to/text-to-speech-basics/text-to-speech-basics-cpp.md)] ::: zone-end + ::: zone pivot="programming-language-java" [!INCLUDE [Java Basics include](includes/how-to/text-to-speech-basics/text-to-speech-basics-java.md)] ::: zone-end
cognitive-services How To Custom Voice Create Voice https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/how-to-custom-voice-create-voice.md
Last updated 11/04/2019
-# Create a Custom Voice
+# Create and use your voice model
-In [Prepare data for Custom Voice](how-to-custom-voice-prepare-data.md), we described the different data types you can use to train a custom voice and the different format requirements. Once you have prepared your data, you can start to upload them to the [Custom Voice portal](https://aka.ms/custom-voice-portal), or through the Custom Voice training API. Here we describe the steps of training a custom voice through the portal.
+In [Prepare data for your training](how-to-custom-voice-prepare-data.md), you learned about the different data types you can use to train a custom neural voice and the different format requirements. Once you've prepared your data and the voice talent verbal statement, you can start to upload them to the [Speech Studio](https://aka.ms/custom-voice-portal). In this article, you learn how to train a Custom Neural Voice through the Speech Studio portal. See the [supported languages](language-support.md#customization) for custom neural voice.
+
+## Prerequisites
+
+* Complete [get started with Custom Neural Voice](how-to-custom-voice.md)
+* [Prepare data for training](how-to-custom-voice-prepare-data.md)
+
+## Set up voice talent
+
+A voice talent is an individual or target speaker whose voice is recorded and used to create neural voice models. Before you create a voice, define your voice persona and select the right voice talent. For details on recording voice samples, see [the tutorial](record-custom-voice-samples.md).
+
+To train a neural voice, you must create a voice talent profile with an audio file recorded by the voice talent consenting to the usage of their speech data to train a custom voice model. When preparing your recording script, make sure you include the following sentence:
+
+**"I [state your first and last name] am aware that recordings of my voice will be used by [state
+the name of the company] to create and use a synthetic version of my voice."**
+
+This sentence is used to verify whether the training data matches the audio in the consent statement. Read more about the [voice talent verification](/legal/cognitive-services/speech-service/custom-neural-voice/data-privacy-security-custom-neural-voice?context=%2fazure%2fcognitive-services%2fspeech-service%2fcontext%2fcontext) here.
+
+> [!NOTE]
+> Custom Neural Voice is available with limited access. Make sure you understand the [responsible AI requirements](/legal/cognitive-services/speech-service/custom-neural-voice/limited-access-custom-neural-voice?context=%2fazure%2fcognitive-services%2fspeech-service%2fcontext%2fcontext), and then [apply for access](https://aka.ms/customneural).
+
+The following steps assume you've prepared the voice talent verbal consent files. Go to [Speech Studio](https://aka.ms/custom-voice-portal) to select a custom neural voice project, and then follow these steps to create a voice talent profile.
+
+1. Navigate to **Text-to-Speech** > **Custom Voice** > **select a project** > **Set up voice talent**.
+
+2. Click **Add voice talent**.
+
+3. Next, to define the voice characteristics, click the **Target scenario** to be used. Then describe your **Voice characteristics**.
> [!NOTE]
-> This page assumes you have read [Get started with Custom Voice](how-to-custom-voice.md) and [Prepare data for Custom Voice](how-to-custom-voice-prepare-data.md), and have created a Custom Voice project.
+> The scenarios you provide must be consistent with what you've applied for in the application form.
-Check the languages supported for custom voice: [language for customization](language-support.md#customization).
+4. Then, go to **Upload voice talent statement** and follow the instructions to upload the voice talent statement you've prepared beforehand.
+
+> [!NOTE]
+> Make sure the verbal statement is recorded in the same settings as your training data, including the recording environment and speaking style.
+
+5. Finally, go to **Review and submit**, review the settings, and click **Submit**.
## Upload your datasets
-When you're ready to upload your data, go to the [Custom Voice portal](https://aka.ms/custom-voice-portal). Create or select a Custom Voice project. The project must share the right language/locale and the gender properties as the data you intend to use for your voice training. For example, select `en-GB` if the audio recordings you have is done in English with a UK accent.
+When you're ready to upload your data, go to the **Prepare training data** tab to add your first training set and upload data. A training set is a set of audio utterances and their mapping scripts used for training a voice model. You can use a training set to organize your training data. Data readiness checks are done per training set. You can import multiple datasets into a training set.
+
+To create and review your training data, do the following:
-Go to the **Data** tab and click **Upload data**. In the wizard, select the correct data type that matches what you have prepared.
+1. On the **Prepare training data** tab, click **Add training set**, enter a **Name** and **Description**, and then click **Create** to add a new training set.
-Each dataset you upload must meet the requirements for the data type that you choose. It is important to correctly format your data before it's uploaded. This ensures the data will be accurately processed by the Custom Voice service. Go to [Prepare data for Custom Voice](how-to-custom-voice-prepare-data.md) and make sure your data has been rightly formatted.
+ When the training set is successfully created, you can start to upload your data.
+
+2. To upload data, click **Upload data** > **Choose data type** > **Upload data** and **Specify the target training set** > enter a **Name** and **Description** for your dataset > review the settings, and click **Upload**.
> [!NOTE]
-> Free subscription (F0) users can upload two datasets simultaneously. Standard subscription (S0) users can upload five datasets simultaneously. If you reach the limit, wait until at least one of your datasets finishes importing. Then try again.
+>- Duplicate audio names will be removed from the training. Make sure the datasets you select don't contain the same audio names within the .zip file or across multiple .zip files. If utterance IDs (either in audio or script files) are duplicated, they'll be rejected.
+>- If you've created datasets in a previous version of Speech Studio, you must specify a training set for your datasets in advance to use them. Otherwise, an exclamation mark is appended to the dataset name, and the dataset can't be used.
+
+Each dataset you upload must meet the requirements for the data type that you choose. It's important to correctly format your data before it's uploaded, which ensures the data will be accurately processed by the Custom Neural Voice service. Go to [Prepare data for your training](how-to-custom-voice-prepare-data.md) and make sure your data is correctly formatted.
> [!NOTE]
-> The maximum number of datasets allowed to be imported per subscription is 10 .zip files for free subscription (F0) users and 500 for standard subscription (S0) users.
+> - Standard subscription (S0) users can upload five datasets simultaneously. If you reach the limit, wait until at least one of your datasets finishes importing. Then try again.
+> - The maximum number of datasets allowed to be imported per subscription is 10 .zip files for free subscription (F0) users and 500 for standard subscription (S0) users.
-Datasets are automatically validated once you hit the upload button. Data validation includes series of checks on the audio files to verify their file format, size, and sampling rate. Fix the errors if any and submit again. When the data-importing request is successfully initiated, you should see an entry in the data table that corresponds to the dataset you've just uploaded.
+Datasets are automatically validated once you hit the **Upload** button. Data validation includes a series of checks on the audio files to verify their file format, size, and sampling rate. Fix any errors and submit again.
-The following table shows the processing states for imported datasets:
+Once the data is uploaded, you can check the details in the training set detail view. On the **Overview** tab, you can further check the pronunciation scores and the noise level for each of your datasets. The pronunciation score ranges from 0 to 100. A score below 70 normally indicates a speech error or script mismatch. A heavy accent can reduce your pronunciation score and impact the generated digital voice.
-| State | Meaning |
-| -- | - |
-| Processing | Your dataset has been received and is being processed. |
-| Succeeded | Your dataset has been validated and may now be used to build a voice model. |
-| Failed | Your dataset has been failed during processing due to many reasons, for example file errors, data problems or network issues. |
+A higher signal-to-noise ratio (SNR) indicates lower noise in your audio. You can typically reach a 50+ SNR by recording at professional studios. Audio with an SNR below 20 can result in obvious noise in your generated voice.
-After validation is complete, you can see the total number of matched utterances for each of your datasets in the **Utterances** column. If the data type you have selected requires long-audio segmentation, this column only reflects the utterances we have segmented for you either based on your transcripts or through the speech transcription service. You can further download the dataset validated to view the detail results of the utterances successfully imported and their mapping transcripts. Hint: long-audio segmentation can take more than an hour to complete data processing.
+Consider re-recording any utterances with low pronunciation scores or poor signal-to-noise ratios. If you can't re-record, consider excluding those utterances from your dataset.
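For intuition about the SNR numbers above: SNR in decibels is ten times the base-10 logarithm of the ratio of signal power to noise power. A quick sketch (the helper is illustrative, not part of the LUIS or Speech tooling):

```python
import math

# Illustrative helper, not part of the Speech tooling: compute SNR in dB
# from average power estimates of the signal and the background noise.

def snr_db(signal_power, noise_power):
    """Signal-to-noise ratio in decibels."""
    return 10 * math.log10(signal_power / noise_power)
```

A studio-quality 50 dB SNR corresponds to signal power 100,000 times the noise power; at 20 dB the ratio is only 100.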
-In the data detail view, you can further check the pronunciation scores and the noise level for each of your datasets. The pronunciation score ranges from 0 to 100. A score below 70 normally indicates a speech error or script mismatch. A heavy accent can reduce your pronunciation score and impact the generated digital voice.
+On the **Data details** tab, you can check the data details of the training set. If there are typical issues with the data, follow the instructions in the displayed message to fix them before training.
-A higher signal-to-noise ratio (SNR) indicates lower noise in your audio. You can typically reach a 50+ SNR by recording at professional studios. Audio with an SNR below 20 can result in obvious noise in your generated voice.
+The issues are divided into three types. Refer to the following three tables to check the respective types of errors.
-Consider re-recording any utterances with low pronunciation scores or poor signal-to-noise ratios. If you can't re-record, you might exclude those utterances from your dataset.
+The first type of errors listed in the table below must be fixed manually; otherwise, the data with these errors will be excluded during training.
-> [!NOTE]
-> It is required that if you are using Custom Neural Voice, you must register your voice talent in the **Voice Talent** tab. When preparing your recording script, make sure you include the below sentence to acquire the voice talent acknowledgement of using their voice data to create a TTS voice model and generate synthetic speech.
-"I [state your first and last name] am aware that recordings of my voice will be used by [state the name of the company] to create and use a synthetic version of my voice."
-This sentence will be used to verify if the recordings in your training datasets are done by the same person that makes the consent. [Read more about how your data will be processed and how voice talent verification is done here](/legal/cognitive-services/speech-service/custom-neural-voice/data-privacy-security-custom-neural-voice?context=%2fazure%2fcognitive-services%2fspeech-service%2fcontext%2fcontext).
+| Category | Name | Description | Suggestion |
+| | -- | -- | |
+| Script | Invalid separator| These script lines don't have valid separator TAB:{}.| Use TAB to separate ID and content.|
+| Script | Invalid script ID| Script ID format is invalid.| Script line ID should be numeric.|
+| Script | Script content duplicated| Line {} script content is duplicated with line {}.| Script line content should be unique.|
+| Script | Script content too long| Script line content is longer than maximum 1000.| Script line content length should be less than 1000 characters.|
+| Script | Script has no matching audio| Script line ID doesn't have matching audio.| Script line ID should match audio ID.|
+| Script | No valid script| No valid script found in this dataset.| Fix the problematic script lines according to detailed issue list.|
+| Audio | Audio has no matching script| Audio file doesn't match script ID.| Wav file name should match ID in script file.|
+| Audio | Invalid audio format| Wav file has an invalid format and cannot be read.| Check wav file format by audio tool like sox.|
+| Audio | Low sampling rate| Audio sampling rate is lower than 16 KHz. | Wav file sampling rate should be equal to or higher than 16 KHz. |
+| Audio | Audio duration too long| Audio duration is longer than 30 seconds.| Split long audio into multiple files to make sure each is less than 15 seconds.|
+| Audio | No valid audio| No valid audio found in this dataset.| Fix the problematic audio according to detailed issue list.|
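The script rules in the table above (TAB separator, numeric ID, unique content, content under 1,000 characters) lend themselves to a local pre-check before upload. The following is a minimal sketch of such a check; the function name and exact messages are illustrative assumptions, not part of any Microsoft SDK:

```python
# Hypothetical pre-check for a Custom Voice script file, mirroring the
# validation rules above: each line is "<numeric ID><TAB><content>", the
# content must be unique and shorter than 1,000 characters.
def check_script_lines(lines):
    """Return a list of (line_number, issue) tuples for a script file."""
    issues = []
    seen_content = {}
    for i, line in enumerate(lines, start=1):
        if "\t" not in line:
            issues.append((i, "Invalid separator: use TAB between ID and content"))
            continue
        script_id, content = line.split("\t", 1)
        if not script_id.isdigit():
            issues.append((i, "Invalid script ID: should be numeric"))
        if len(content) >= 1000:
            issues.append((i, "Script content too long: keep under 1000 characters"))
        if content in seen_content:
            issues.append((i, f"Duplicated content (same as line {seen_content[content]})"))
        else:
            seen_content[content] = i
    return issues
```

Running this over your script file before zipping it can save an upload round trip when a line is malformed.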
-## Build your custom voice model
+The second type of errors listed in the table below will be fixed automatically, but double-checking the fixed data is recommended.
-After your dataset has been validated, you can use it to build your custom voice model.
+| Category | Name | Description | Suggestion |
+| -------- | ---- | ----------- | ---------- |
+| Audio | Stereo audio | Only one channel in stereo audio will be used for TTS model training.| Use mono in TTS recording or data preparation. This audio is converted into mono. Download normalized dataset and review.|
+| Volume | Volume peak out of range |Volume peak is not within range -3 dB (70% of max volume) to -6 dB (50%). It's auto adjusted to -4 dB (65%) now.| Control volume peak to proper range during recording or data preparation. This audio is linear scaled to fit the peak range. Download normalized dataset and review.|
+|Mismatch | Long silence detected before first word | Long silence detected before first word.| The start silence is trimmed to 200 ms. Download normalized dataset and review. |
+| Mismatch | Long silence detected after last word | Long silence detected after last word. | The end silence is trimmed to 200 ms. Download normalized dataset and review. |
+| Mismatch |Start silence too short | Start silence is shorter than 100 ms. | The start silence is extended to 100 ms. Download normalized dataset and review. |
+| Mismatch | End silence too short | End silence is shorter than 100 ms. | The end silence is extended to 100 ms. Download normalized dataset and review. |
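The "Volume peak out of range" auto-fix above is a simple linear rescale. As an illustration only (this is not Microsoft's implementation, just the arithmetic the table describes), the adjustment can be sketched like this, with `samples` as floats in [-1.0, 1.0]:

```python
import math

# Illustrative sketch of the auto-fix described above: if the volume peak
# falls outside the -3 dB to -6 dB range, linearly scale the samples so the
# peak sits at -4 dB of full scale.
def normalize_peak(samples, target_db=-4.0, low_db=-6.0, high_db=-3.0):
    peak = max(abs(s) for s in samples)
    if peak == 0:
        return samples  # pure silence, nothing to scale
    peak_db = 20 * math.log10(peak)
    if low_db <= peak_db <= high_db:
        return samples  # already in range, leave untouched
    scale = 10 ** (target_db / 20) / peak
    return [s * scale for s in samples]
```

Because the scaling is linear, the waveform shape (and therefore the voice) is unchanged; only the overall level moves.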
-1. Navigate to **Text-to-Speech > Custom Voice > [name of project] > Model**.
+The third type of errors listed in the table below won't cause the data to be excluded during training, but they will affect the quality of training. For higher-quality training, manually fixing these errors is recommended.
-2. Click **Train model**.
+| Category | Name | Description | Suggestion |
+| -------- | ---- | ----------- | ---------- |
+| Script | Contains digits 0-9| These script lines contain digits 0-9.| Expand digits 0-9 to normalized words and match them with the audio. For example, expand '123' to 'one hundred and twenty-three'.|
+| Script | Pronunciation confused word '{}' | Script contains pronunciation confused word: '{}'.| Expand word to its actual pronunciation. For example, {}.|
+| Script | Question utterances too few| Question script lines are less than 1/6 of total script lines.| Question script lines should be at least 1/6 of total lines for voice font properly expressing question tone.|
+| Script | Exclamation utterances too few| Exclamation script lines are less than 1/6 of total script lines.| Exclamation script lines should be at least 1/6 of total lines for voice font properly expressing exclamation tone.|
+| Audio| Low sampling rate for neural voice | Audio sampling rate is lower than 24 KHz.| Wav file sampling rate should be equal to or higher than 24 KHz for high-quality neural voice.|
+| Volume | Overall volume too low | Volume of {} samples is lower than -18 dB (10% of max volume).| Control volume average level to proper range during recording or data preparation.|
+| Volume | Volume truncation| Volume truncation is detected at {}s.| Adjust recording equipment to avoid volume truncation at its peak value.|
+| Volume | Start silence not clean | First 100 ms silence isn't clean. Detect volume larger than -40 dB (1% of max volume).| Reduce recording noise floor level and leave the starting 100 ms as silence.|
+| Volume| End silence not clean| Last 100 ms silence isn't clean. Detect volume larger than -40 dB (1% of max volume).| Reduce recording noise level and leave the end 100 ms as silence.|
+| Mismatch | Script audio mismatch detected| There's a mismatch between script and audio content. | Review script and audio content to make sure they match and control the noise floor level. Reduce the long silence length or split into multiple utterances.|
+| Mismatch | Extra audio energy detected before first word | Extra audio energy detected before first word. It may also be because of too short start silence before first word.| Review script and audio content to make sure they match and control the noise floor level. Also leave 100 ms silence before first word.|
+| Mismatch | Extra audio energy detected after last word| Extra audio energy detected after last word. It may also be because of too short silence after last word.| Review script and audio content to make sure they match and control the noise floor level. Also leave 100 ms silence after last word.|
+| Mismatch | Low signal-noise ratio | Audio SNR level is lower than {} dB.| Reduce audio noise level during recording or data preparation.|
+| Mismatch | Recognize speech content fail | Fail to do speech recognition on this audio.| Check audio and script content to make sure the audio is valid speech, and match with script.|
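The "Low signal-noise ratio" check above compares speech energy against background noise. A rough local estimate, as a sketch only (the service's segmentation and threshold are not public, so the inputs here, a speech segment and a silence-only segment, are assumptions), looks like this:

```python
import math

# Rough SNR estimate in the spirit of the "Low signal-noise ratio" check:
# compare the average power of a speech portion against the average power
# of a silence (noise-only) portion, in decibels.
def snr_db(speech_samples, noise_samples):
    def power(xs):
        return sum(x * x for x in xs) / len(xs)
    noise = power(noise_samples)
    if noise == 0:
        return float("inf")  # no measurable noise floor
    return 10 * math.log10(power(speech_samples) / noise)
```

For example, speech peaking around 0.5 against a noise floor around 0.005 gives roughly 40 dB, which is in the range of a clean studio recording.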
-3. Next, enter a **Name** and **Description** to help you identify this model.
+## Train your custom neural voice model
- Choose a name carefully. The name you enter here will be the name you use to specify the voice in your request for speech synthesis as part of the SSML input. Only letters, numbers, and a few punctuation characters such as -, \_, and (', ') are allowed. Use different names for different voice models.
+After your dataset has been validated, you can use it to build your custom neural voice model.
- A common use of the **Description** field is to record the names of the datasets that were used to create the model.
+1. On the **Train model** tab, click **Train model** to create a voice model with the data you have uploaded.
-4. From the **Select training data** page, choose one or multiple datasets that you would like to use for training. Check the number of utterances before you submit them. You can start with any number of utterances for en-US and zh-CN voice models using the "Adaptive" training method. For other locales, you must select more than 2,000 utterances to be able to train a voice using a standard tier including the "Statistical parametric" and "Concatenative" training methods, and more than 300 utterances to train a custom neural voice.
+2. Select the neural training method for your model and target language.
- > [!NOTE]
- > Duplicate audio names will be removed from the training. Make sure the datasets you select do not contain the same audio names across multiple .zip files.
+By default, your voice model is trained in the same language as your training data. You can also choose to create a secondary language (preview) for your voice model. Check the languages supported for Custom Neural Voice: [language for customization](language-support.md#customization).
- > [!TIP]
- > Using the datasets from the same speaker is required for quality results. Different training methods require different training data size. To train a model with the "Statistical parametric" method, at least 2,000 distinct utterances are required. For the "Concatenative" method, it's 6,000 utterances, while for "Neural", the minimum data size requirement is 300 utterances.
+3. Next, choose the dataset you want to use for training, and specify a speaker file.
-5. Select the **training method** in the next step.
+>[!NOTE]
+>- You need to select at least 300 utterances to create a custom neural voice.
+>- To train a neural voice, you must specify a voice talent profile with an audio consent file in which the voice talent acknowledges the use of their speech data to train a custom voice model. Custom Neural Voice is available with limited access. Make sure you understand the [responsible AI requirements](/legal/cognitive-services/speech-service/custom-neural-voice/limited-access-custom-neural-voice?context=%2fazure%2fcognitive-services%2fspeech-service%2fcontext%2fcontext) and [apply the access here](https://aka.ms/customneural).
+>- On this page you can also choose to upload your script for testing. The testing script must be a .txt file, less than 1 MB. Supported encoding formats include ANSI/ASCII, UTF-8, UTF-8-BOM, UTF-16-LE, and UTF-16-BE. Each paragraph of the utterance will result in a separate audio file. If you want to combine all sentences into one audio file, put them in one paragraph.
- > [!NOTE]
- > If you would like to train a neural voice, you must specify a voice talent profile with the audio consent file provided of the voice talent acknowledging to use his/her speech data to train a custom voice model. Custom Neural Voice is available with limited access. Make sure you understand the [responsible AI requirements](/legal/cognitive-services/speech-service/custom-neural-voice/limited-access-custom-neural-voice?context=%2fazure%2fcognitive-services%2fspeech-service%2fcontext%2fcontext) and [apply the access here](https://aka.ms/customneural).
-
- On this page you can also select to upload your script for testing. The testing script must be a txt file, less than 1Mb. Supported encoding format includes ANSI/ASCII, UTF-8, UTF-8-BOM, UTF-16-LE, or UTF-16-BE. Each paragraph of the utterance will result in a separate audio. If you want to combine all sentences into one audio, make them in one paragraph.
+4. Then, enter a **Name** and **Description** to help you identify this model.
-6. Click **Train** to begin creating your voice model.
+Choose a name carefully. The name you enter here will be the name you use to specify the voice in your request for speech synthesis as part of the SSML input. Only letters, numbers, and a few punctuation characters such as -, \_, and (', ') are allowed. Use different names for different neural voice models.
+
+A common use of the **Description** field is to record the names of the datasets that were used to create the model.
+
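Since the model name becomes part of your SSML requests, it can be worth validating it up front. A minimal sketch of such a check follows; the docs only say "letters, numbers, and a few punctuation characters such as -, _", so the exact allowed set in this pattern is an assumption for illustration:

```python
import re

# Hypothetical validator for the model-name rule above. The character set
# (letters, digits, hyphen, underscore, parentheses) is an assumption based
# on the examples in the documentation, not an official specification.
_NAME_RE = re.compile(r"^[A-Za-z0-9_()\-]+$")

def is_valid_model_name(name):
    return bool(_NAME_RE.match(name))
```

Rejecting spaces and other punctuation early avoids discovering the problem later when the name is embedded in an SSML `voice` element.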
+5. Review the settings, then click **Submit** to start training the model.
+
+> [!NOTE]
+> Duplicate audio names will be removed from the training. Make sure the datasets you select don't contain the same audio names across multiple .zip files.
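Because duplicate audio names across .zip files are silently dropped from training, a quick local scan of your dataset archives can catch them first. The following sketch (the function name is illustrative) uses only the standard library:

```python
import zipfile

# Sketch of a local pre-check for the note above: collect .wav file names
# across several dataset .zip files and report any base name that appears
# more than once, since duplicates would be removed from training.
def duplicate_wav_names(zip_paths):
    seen, dupes = set(), set()
    for path in zip_paths:
        with zipfile.ZipFile(path) as zf:
            for name in zf.namelist():
                base = name.rsplit("/", 1)[-1]
                if not base.lower().endswith(".wav"):
                    continue
                if base in seen:
                    dupes.add(base)
                seen.add(base)
    return sorted(dupes)
```

Any name returned here should be renamed (and its transcript ID updated to match) before uploading.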
-The Training table displays a new entry that corresponds to this newly created model. The table also displays the status: Processing, Succeeded, Failed.
+The **Train model** table displays a new entry that corresponds to this newly created model. The table also displays the status: Processing, Succeeded, Failed.
The status that's shown reflects the process of converting your dataset to a voice model, as shown here.
| State | Meaning |
| ----- | ------- |
| Processing | Your voice model is being created. |
| Succeeded | Your voice model has been created and can be deployed. |
-| Failed | Your voice model has been failed in training due to many reasons, for example unseen data problems or network issues. |
+| Failed | Your voice model failed in training, for example because of unseen data problems or network issues. |
-Training time varies depending on the volume of audio data processed and the training method you have selected. It can range from 30 minutes to 40 hours. Once your model training is succeeded, you can start to test it.
+Training duration varies depending on how much data you use for training. It takes about 40 compute hours on average to train a custom neural voice.
> [!NOTE]
-> Free subscription (F0) users can train one voice font simultaneously. Standard subscription (S0) users can train three voices simultaneously. If you reach the limit, wait until at least one of your voice fonts finishes training, and then try again.
+> Training of custom neural voices isn't free. Check the [pricing](https://azure.microsoft.com/pricing/details/cognitive-services/speech-services/) here. Standard subscription (S0) users can train three voices simultaneously. If you reach the limit, wait until at least one of your voice fonts finishes training, and then try again.
-> [!NOTE]
-> Training of custom neural voices is not free. Check the [pricing](https://azure.microsoft.com/pricing/details/cognitive-services/speech-services/) here.
-
-> [!NOTE]
-> The maximum number of voice models allowed to be trained per subscription is 10 models for free subscription (F0) users and 100 for standard subscription (S0) users.
-
-If you are using the neural voice training capability, you can select to train a model optimized for real-time streaming scenarios, or a HD neural model optimized for asynchronous [long-audio synthesis](long-audio-api.md).
-
-## Test your voice model
+6. After you finish training the model successfully, you can review the model details.
Each training will generate 100 sample audio files automatically to help you test the model. After your voice model is successfully built, you can test it before deploying it for use.
-1. Navigate to **Text-to-Speech > Custom Voice > [name of project] > Model**.
+The quality of the voice depends on many factors, including the size of the training data, the quality of the recording, the accuracy of the transcript file, how well the recorded voice in the training data matches the personality of the designed voice for your intended use case, and more. [Check here to learn more about the capabilities and limits of our technology and the best practice to improve your model quality](/legal/cognitive-services/speech-service/custom-neural-voice/characteristics-and-limitations-custom-neural-voice?context=%2fazure%2fcognitive-services%2fspeech-service%2fcontext%2fcontext).
-2. Click the name of the model you would like to test.
+## Create and use a custom neural voice endpoint
-3. On the model detail page, you can find the sample audio files under the **Testing** tab.
+After you've successfully created and tested your voice model, you deploy it in a custom Text-to-Speech endpoint. You then use this endpoint in place of the usual endpoint when making Text-to-Speech requests through the REST API. Your custom endpoint can be called only by the subscription that you've used to deploy the font.
-The quality of the voice depends on a number of factors, including the size of the training data, the quality of the recording, the accuracy of the transcript file, how well the recorded voice in the training data matches the personality of the designed voice for your intended use case, and more. [Check here to learn more about the capabilities and limits of our technology and the best practice to improve your model quality](/legal/cognitive-services/speech-service/custom-neural-voice/characteristics-and-limitations-custom-neural-voice?context=%2fazure%2fcognitive-services%2fspeech-service%2fcontext%2fcontext).
+Do the following to create a custom neural voice endpoint.
-## Create and use a custom voice endpoint
+1. On the **Deploy model** tab, click **Add endpoint**.
+2. Next, select a voice model you would like to associate with this endpoint.
+3. Then, enter a **Name** and **Description** for your custom endpoint.
+4. Finally, you can review the settings and click **Create** to create your endpoint.
-After you've successfully created and tested your voice model, you deploy it in a custom Text-to-Speech endpoint. You then use this endpoint in place of the usual endpoint when making Text-to-Speech requests through the REST API. Your custom endpoint can be called only by the subscription that you have used to deploy the font.
-
-To create a new custom voice endpoint, go to **Text-to-Speech > Custom Voice > Endpoint**. Select **Add endpoint** and enter a **Name** and **Description** for your custom endpoint. Then select the custom voice model you would like to associate with this endpoint.
-
-After you have clicked the **Add** button, in the endpoint table, you will see an entry for your new endpoint. It may take a few minutes to instantiate a new endpoint. When the status of the deployment is **Succeeded**, the endpoint is ready for use.
+After you've clicked the **Create** button, in the endpoint table, you'll see an entry for your new endpoint. It may take a few minutes to instantiate a new endpoint. When the status of the deployment is **Succeeded**, the endpoint is ready for use.
You can **Suspend** and **Resume** your endpoint if you don't use it all the time. When an endpoint is reactivated after suspension, the endpoint URL will be kept the same so you don't need to change your code in your apps.
-You can also update the endpoint to a new model. To change the model, make sure the new model is named the same as the one your want to update.
-
-> [!NOTE]
-> Free subscription (F0) users can have only one model deployed. Standard subscription (S0) users can create up to 50 endpoints, each with its own custom voice.
+You can also update the endpoint to a new model. To change the model, make sure the new model is named the same as the one you want to update.
> [!NOTE]
-> To use your custom voice, you must specify the voice model name, use the custom URI directly in an HTTP request, and use the same subscription to pass through the authentication of TTS service.
+>- Standard subscription (S0) users can create up to 50 endpoints, each with its own custom neural voice.
+>- To use your custom neural voice, you must specify the voice model name, use the custom URI directly in an HTTP request, and use the same subscription to pass the authentication of the TTS service.
After your endpoint is deployed, the endpoint name appears as a link. Click the link to display information specific to your endpoint, such as the endpoint key, endpoint URL, and sample code.
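Putting the note above together, a request to a custom endpoint is an SSML body that names your voice model, sent with your subscription key. A minimal sketch of assembling (not sending) such a request follows; the voice name, text, and key are placeholders you'd copy from the endpoint detail page:

```python
# Sketch of the SSML body and headers for calling a custom TTS endpoint.
# Nothing is sent here; pass the returned values to your HTTP client of
# choice with the endpoint URL from the endpoint detail page.
def build_tts_request(voice_name, text, subscription_key):
    ssml = (
        "<speak version='1.0' xml:lang='en-US'>"
        f"<voice name='{voice_name}'>{text}</voice>"
        "</speak>"
    )
    headers = {
        "Ocp-Apim-Subscription-Key": subscription_key,
        "Content-Type": "application/ssml+xml",
        "X-Microsoft-OutputFormat": "riff-24khz-16bit-mono-pcm",
    }
    return ssml, headers
```

The `voice name` attribute is where the custom model name you chose during training goes, which is why that name has to follow the naming rules exactly.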
-Online testing of the endpoint is also available via the custom voice portal. To test your endpoint, choose **Check endpoint** from the **Endpoint detail** page. The endpoint testing page appears. Enter the text to be spoken (in either plain text or [SSML format](speech-synthesis-markup.md) in the text box. To hear the text spoken in your custom voice font, select **Play**. This testing feature will be charged against your custom speech synthesis usage.
+The custom endpoint is functionally identical to the standard endpoint that's used for text-to-speech requests. For more information, see [Speech SDK](./get-started-text-to-speech.md) or [REST API](rest-text-to-speech.md).
-The custom endpoint is functionally identical to the standard endpoint that's used for text-to-speech requests. See [REST API](rest-text-to-speech.md) for more information.
+We also provide an online tool, [Audio Content Creation](https://speech.microsoft.com/audiocontentcreation), that allows you to fine-tune your audio output using a friendly UI.
## Next steps
-* [Guide: Record your voice samples](record-custom-voice-samples.md)
-* [Text-to-Speech API reference](rest-text-to-speech.md)
-* [Long Audio API](long-audio-api.md)
+- [Tutorial for voice sample recording](record-custom-voice-samples.md)
+- [Text-to-Speech API reference](rest-text-to-speech.md)
+- [Long Audio API](long-audio-api.md)
cognitive-services How To Custom Voice Prepare Data https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/how-to-custom-voice-prepare-data.md
Last updated 11/04/2019
-
-# Prepare data to create a custom voice
+# Prepare training data
When you're ready to create a custom text-to-speech voice for your application, the first step is to gather audio recordings and associated scripts to start training the voice model. The Speech service uses this data to create a unique voice tuned to match the voice in the recordings. After you've trained the voice, you can start synthesizing speech in your applications.
+## Voice talent verbal statement
+ Before you can train your own text-to-speech voice model, you'll need audio recordings and the associated text transcriptions. On this page, we'll review data types, how they are used, and how to manage each. > [!NOTE]
-> If you would like to train a neural voice, you must specify a voice talent profile with the audio consent file provided of the voice talent acknowledging to use his/her speech data to train a custom voice model. When preparing your recording script, make sure you include the below sentence.
+> To train a neural voice, you must create a voice talent profile with an audio file recorded by the voice talent consenting to the usage of their speech data to train a custom voice model. When preparing your recording script, make sure you include the below sentence.
> “I [state your first and last name] am aware that recordings of my voice will be used by [state the name of the company] to create and use a synthetic version of my voice.” This sentence will be used to verify whether the training data was recorded by the same person who gave the consent. Read more about the [voice talent verification](/legal/cognitive-services/speech-service/custom-neural-voice/data-privacy-security-custom-neural-voice?context=%2fazure%2fcognitive-services%2fspeech-service%2fcontext%2fcontext) here. > Custom Neural Voice is available with limited access. Make sure you understand the [responsible AI requirements](/legal/cognitive-services/speech-service/custom-neural-voice/limited-access-custom-neural-voice?context=%2fazure%2fcognitive-services%2fspeech-service%2fcontext%2fcontext) and [apply the access here](https://aka.ms/customneural).
-## Data types
+## Types of training data
A voice training dataset includes audio recordings, and a text file with the associated transcriptions. Each audio file should contain a single utterance (a single sentence or a single turn for a dialog system), and be less than 15 seconds long.
-In some cases, you may not have the right dataset ready and will want to test the custom voice training with available audio files, short or long, with or without transcripts. We provide tools (beta) to help you segment your audio into utterances and prepare transcripts using the [Batch Transcription API](batch-transcription.md).
+In some cases, you may not have the right dataset ready and will want to test the custom neural voice training with available audio files, short or long, with or without transcripts. We provide tools (beta) to help you segment your audio into utterances and prepare transcripts using the [Batch Transcription API](batch-transcription.md).
This table lists data types and how each is used to create a custom text-to-speech voice model.
-| Data type | Description | When to use | Additional processing required |
+| Data type | Description | When to use | Additional processing required |
| | -- | -- | | | **Individual utterances + matching transcript** | A collection (.zip) of audio files (.wav) as individual utterances. Each audio file should be 15 seconds or less in length, paired with a formatted transcript (.txt). | Professional recordings with matching transcripts | Ready for training. |
-| **Long audio + transcript (beta)** | A collection (.zip) of long, unsegmented audio files (longer than 20 seconds), paired with a transcript (.txt) that contains all spoken words. | You have audio files and matching transcripts, but they are not segmented into utterances. | Segmentation (using batch transcription).<br>Audio format transformation where required. |
-| **Audio only (beta)** | A collection (.zip) of audio files without a transcript. | You only have audio files available, without transcripts. | Segmentation + transcript generation (using batch transcription).<br>Audio format transformation where required.|
+| **Long audio + transcript (beta)** | A collection (.zip) of long, unsegmented audio files (longer than 20 seconds), paired with a transcript (.txt) that contains all spoken words. | You have audio files and matching transcripts, but they are not segmented into utterances. | Segmentation (using batch transcription).<br>Audio format transformation where required. |
+| **Audio only (beta)** | A collection (.zip) of audio files without a transcript. | You only have audio files available, without transcripts. | Segmentation + transcript generation (using batch transcription).<br>Audio format transformation where required.|
Files should be grouped by type into a dataset and uploaded as a zip file. Each dataset can only contain a single data type.
You can prepare recordings of individual utterances and the matching transcript
To produce a good voice model, create the recordings in a quiet room with a high-quality microphone. Consistent volume, speaking rate, speaking pitch, and expressive mannerisms of speech are essential. > [!TIP]
-> To create a voice for production use, we recommend you use a professional recording studio and voice talent. For more information, see [How to record voice samples for a custom voice](record-custom-voice-samples.md).
+> To create a voice for production use, we recommend you use a professional recording studio and voice talent. For more information, see [record voice samples to create a Custom Neural Voice](record-custom-voice-samples.md).
### Audio files
Follow these guidelines when preparing audio.
| Property | Value | | -- | -- | | File format | RIFF (.wav), grouped into a .zip file |
-| Sampling rate | At least 16,000 Hz |
+| Sampling rate | At least 16,000 Hz. For creating a neural voice, 24,000 Hz is required. |
| Sample format | PCM, 16-bit | | File name | Numeric, with .wav extension. No duplicate file names allowed. | | Audio length | Shorter than 15 seconds |
Follow these guidelines when preparing audio.
| Maximum archive size | 2048 MB | > [!NOTE]
-> .wav files with a sampling rate lower than 16,000 Hz will be rejected. If a .zip file contains .wav files with different sample rates, only those equal to or higher than 16,000 Hz will be imported. The portal currently imports .zip archives up to 200 MB. However, multiple archives can be uploaded.
+> .wav files with a sampling rate lower than 16,000 Hz will be rejected. If a .zip file contains .wav files with different sample rates, only those equal to or higher than 16,000 Hz will be imported. The portal currently imports .zip archives up to 2048 MB. However, multiple archives can be uploaded.
+
+> [!NOTE]
> The default sampling rate for a custom neural voice is 24,000 Hz. Your .wav files with a sampling rate lower than 16,000 Hz will be up-sampled to 24,000 Hz to train a neural voice. It's recommended that you use a sample rate of 24,000 Hz for your training data.
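The audio guidelines above (16-bit PCM in a RIFF .wav container, at least 16,000 Hz, under 15 seconds) can be verified locally with the standard-library `wave` module. The following is a sketch of such a check; the function name and messages are illustrative:

```python
import wave

# Sketch that checks a .wav file against the guidelines above: 16-bit PCM,
# a sampling rate of at least 16,000 Hz (24,000 Hz recommended for neural
# voices), and a duration shorter than 15 seconds. `wave` handles the
# RIFF/PCM parsing; non-RIFF files will raise wave.Error.
def check_wav(path):
    problems = []
    with wave.open(path, "rb") as w:
        if w.getsampwidth() != 2:
            problems.append("sample format should be 16-bit PCM")
        if w.getframerate() < 16000:
            problems.append("sampling rate should be at least 16,000 Hz")
        duration = w.getnframes() / w.getframerate()
        if duration >= 15:
            problems.append("audio should be shorter than 15 seconds")
    return problems
```

Running this over each .wav file before zipping the dataset catches rejections (low sample rate) and quality warnings (over-length clips) before upload.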
### Transcripts
It's important that the transcripts are 100% accurate transcriptions of the co
## Long audio + transcript (beta)
-In some cases, you may not have segmented audio available. We provide a service (beta) through the custom voice portal to help you segment long audio files and create transcriptions. Keep in mind, this service will be charged toward your speech-to-text subscription usage.
+In some cases, you may not have segmented audio available. We provide a service (beta) through the Speech Studio to help you segment long audio files and create transcriptions. Keep in mind, this service will be charged toward your speech-to-text subscription usage.
> [!NOTE] > The long-audio segmentation service will leverage the batch transcription feature of speech-to-text, which only supports standard subscription (S0) users. During the processing of the segmentation, your audio files and the transcripts will also be sent to the Custom Speech service to refine the recognition model so the accuracy can be improved for your data. No data will be retained during this process. After the segmentation is done, only the utterances segmented and their mapping transcripts will be stored for your downloading and training.
Follow these guidelines when preparing audio for segmentation.
| Archive format | .zip | | Maximum archive size | 2048 MB |
+> [!NOTE]
> The default sampling rate for a custom neural voice is 24,000 Hz. Your .wav files with a sampling rate lower than 16,000 Hz will be up-sampled to 24,000 Hz to train a neural voice. It's recommended that you use a sample rate of 24,000 Hz for your training data.
All audio files should be grouped into a zip file. It's OK to put .wav files and .mp3 files into one audio zip. For example, you can upload a zip file containing a 45-second audio file named 'kingstory.wav' and a 200-second audio file named 'queenstory.mp3'. All .mp3 files will be transformed into the .wav format after processing. ### Transcripts
After your dataset is successfully uploaded, we will help you segment the audio
## Audio only (beta)
-If you don't have transcriptions for your audio recordings, use the **Audio only** option to upload your data. Our system can help you segment and transcribe your audio files. Keep in mind, this service will count toward your speech-to-text subscription usage.
+If you don't have transcriptions for your audio recordings, use the **Audio only** option to upload your data. Our system can help you segment and transcribe your audio files. Keep in mind, this service will be charged toward your speech-to-text subscription usage.
Follow these guidelines when preparing audio.
Follow these guidelines when preparing audio.
| Archive format | .zip | | Maximum archive size | 2048 MB |
+> [!NOTE]
> The default sampling rate for a custom neural voice is 24,000 Hz. Your .wav files with a sampling rate lower than 16,000 Hz will be up-sampled to 24,000 Hz to train a neural voice. It's recommended that you use a sample rate of 24,000 Hz for your training data.
All audio files should be grouped into a zip file. Once your dataset is successfully uploaded, we will help you segment the audio file into utterances based on our speech batch transcription service. Unique IDs will be assigned to the segmented utterances automatically. Matching transcripts will be generated through speech recognition. All .mp3 files will be transformed into the .wav format after processing. You can check the segmented utterances and the matching transcripts by downloading the dataset.

## Next steps

-- [Create a Custom Voice](how-to-custom-voice-create-voice.md)
-- [Guide: Record your voice samples](record-custom-voice-samples.md)
+- [Create and use your voice model](how-to-custom-voice-create-voice.md)
+- [Tutorial for voice sample recording](record-custom-voice-samples.md)
cognitive-services How To Custom Voice https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/how-to-custom-voice.md
The diagram below highlights the steps to create a custom voice model using the
2. [Upload data](how-to-custom-voice-create-voice.md#upload-your-datasets) - Upload data (audio and text) using the Custom Voice portal or Custom Voice API. From the portal, you can investigate and evaluate pronunciation scores and signal-to-noise ratios. For more information, see [How to prepare data for Custom Voice](how-to-custom-voice-prepare-data.md).
-3. [Train your model](how-to-custom-voice-create-voice.md#build-your-custom-voice-model) – Use your data to create a custom text-to-speech voice model. You can train a model in different languages. After training, test your model, and if you're satisfied with the result, you can deploy the model.
+3. [Train your model](how-to-custom-voice-create-voice.md#train-your-custom-neural-voice-model) – Use your data to create a custom text-to-speech voice model. You can train a model in different languages. After training, test your model, and if you're satisfied with the result, you can deploy the model.
-4. [Deploy your model](how-to-custom-voice-create-voice.md#create-and-use-a-custom-voice-endpoint) - Create a custom endpoint for your text-to-speech voice model, and use it for speech synthesis in your products, tools, and applications.
+4. [Deploy your model](how-to-custom-voice-create-voice.md#create-and-use-a-custom-neural-voice-endpoint) - Create a custom endpoint for your text-to-speech voice model, and use it for speech synthesis in your products, tools, and applications.
## Custom Neural voices
Custom Voice currently supports both standard and neural tiers. Custom Neural Vo
A Speech service subscription is required before you can use the Custom Speech portal to create a custom model. Follow these instructions to create a Speech service subscription in Azure. If you do not have an Azure account, you can sign up for a new one.
-Once you've created an Azure account and a Speech service subscription, you'll need to sign in to the Custom Voice portal and connect your subscription.
+Once you've created an Azure account and a Speech service subscription, you'll need to sign in to Speech Studio and connect your subscription.
1. Get your Speech service subscription key from the Azure portal.
-2. Sign in to the [Custom Voice portal](https://aka.ms/custom-voice).
+2. Sign in to [Speech Studio](https://speech.microsoft.com), then click **Custom Voice**.
3. Select your subscription and create a speech project.
4. If you'd like to switch to another Speech subscription, use the cog icon located in the top navigation.

> [!NOTE]
> You must have an F0 or an S0 Speech service key created in Azure before you can use the service. Custom Neural Voice only supports the S0 tier.
-## How to create a project
+## Create a project
-Content like data, models, tests, and endpoints are organized into **Projects** in the Custom Voice portal. Each project is specific to a country/language and the gender of the voice you want to create. For example, you may create a project for a female voice for your call center's chat bots that use English in the United States ('en-US').
+Content like data, models, tests, and endpoints are organized into **Projects** in Speech Studio. Each project is specific to a country/language and the gender of the voice you want to create. For example, you may create a project for a female voice for your call center's chat bots that use English in the United States ('en-US').
-To create your first project, select the **Text-to-Speech/Custom Voice** tab, then click **New Project**. Follow the instructions provided by the wizard to create your project. After you've created a project, you will see four tabs: **Data**, **Training**, **Testing**, and **Deployment**. Use the links provided in [Next steps](#next-steps) to learn how to use each tab.
+To create your first project, select the **Text-to-Speech/Custom Voice** tab, then click **Create project**. Follow the instructions provided by the wizard to create your project. After you've created a project, you will see four tabs: **Set up voice talent**, **Prepare training data**, **Train model**, and **Deploy model**. Use the links provided in [Next steps](#next-steps) to learn how to use each tab.
> [!IMPORTANT]
> The [Custom Voice portal](https://aka.ms/custom-voice) was recently updated! If you created previous data, models, tests, and published endpoints in the CRIS.ai portal or with APIs, you need to create a new project in the new portal to connect to these old entities.
-## How to migrate to Custom Neural Voice
+## Tips for creating a custom neural voice
-The standard/non-neural training tier (adaptive, statistical parametric, concacenative) of Custom Voice is being deprecated. The annoucement has been sent out to all existing Speech subscriptions before 2/28/2021. During the deprecation period (3/1/2021 - 2/29/2024), existing standard tier users can continue to use their non-neural models created. All new users/new speech resources should move to the neural tier/Custom Neural Voice. After 2/29/2024, all standard/non-neural custom voices will no longer be supported.
+Creating a great custom voice requires careful quality control in each step, from voice design and data preparation, to the deployment of the voice model to your system. Below are some key steps to take when creating a custom neural voice for your organization.
+
+### Persona design
+
+First, design a persona of the voice that represents your brand by using a persona brief document that defines elements such as the features of the voice and the character behind the voice. This will help guide the process of creating a custom voice model, including defining the scripts, selecting your voice talent, training, and voice tuning.
+
+### Script selection
+
+Carefully select the recording script to represent the user scenarios for your voice. For example, you can use the phrases from bot conversations as your recording script if you are creating a customer service bot. Include different sentence types in your scripts, including statements, questions, exclamations, etc.
+
+### Preparing training data
+
+We recommend that the audio recordings be captured in a professional-quality recording studio to achieve a high signal-to-noise ratio. The quality of the voice model depends heavily on your training data. Consistency in volume, speaking rate, pitch, and the expressive mannerisms of speech is required.
+
+Once the recordings are ready, follow [Prepare data for your training](how-to-custom-voice-prepare-data.md) to prepare the training data in the right format.
+
+### Training
+
+Once you have prepared the training data, go to [Speech Studio](https://aka.ms/custom-voice) to create your custom neural voice. You need to select at least 300 utterances to create a custom neural voice. A series of data quality checks is performed automatically when you upload them. To build high-quality voice models, fix any reported errors and submit again.
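The pre-upload checks described above can be roughly approximated locally before you submit your data. The sketch below is a hypothetical helper, not an official tool; it assumes a tab-separated `<file-id>\t<text>` transcript layout, which you should verify against the data preparation documentation.

```python
import os

MIN_UTTERANCES = 300  # minimum utterance count noted in the text above


def check_dataset(wav_dir: str, transcript_path: str) -> list:
    """Return a list of human-readable problems found in a training dataset.
    Assumes a tab-separated transcript of the form '<file-id>\t<text>'."""
    problems = []

    # Collect the IDs of all .wav files (file name without extension).
    wav_ids = {os.path.splitext(f)[0] for f in os.listdir(wav_dir)
               if f.lower().endswith(".wav")}
    if len(wav_ids) < MIN_UTTERANCES:
        problems.append(
            f"only {len(wav_ids)} .wav files; need at least {MIN_UTTERANCES}")

    # Collect the IDs referenced by the transcript.
    with open(transcript_path, encoding="utf-8") as fh:
        transcript_ids = {line.split("\t", 1)[0].strip()
                          for line in fh if line.strip()}

    # Cross-check both directions.
    for missing in sorted(wav_ids - transcript_ids):
        problems.append(f"no transcript line for {missing}.wav")
    for orphan in sorted(transcript_ids - wav_ids):
        problems.append(f"transcript entry {orphan} has no .wav file")
    return problems
```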
+
+### Testing
+
+Prepare test scripts for your voice model that cover the different use cases for your apps. It's recommended that you use scripts both within and outside the training dataset, so you can test the quality more broadly for different content.
+
+### Tuning and adjustment
+
+The style and the characteristics of the trained voice model depend on the style and the quality of the recordings from the voice talent used for training. However, several adjustments can be made using [SSML (Speech Synthesis Markup Language)](/azure/cognitive-services/speech-service/speech-synthesis-markup?tabs=csharp) when you make the API calls to your voice model to generate synthetic speech. SSML is the markup language used to communicate with the TTS service to convert text into audio. The adjustments include change of pitch, rate, intonation, and pronunciation correction. If the voice model is built with multiple styles, SSML can also be used to switch the styles.
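As an illustrative sketch of the SSML adjustments described above, the fragment below changes rate and pitch and selects a speaking style. The voice name and style here are placeholders (your deployed model's name goes in `name`, and available styles depend on how your model was built); see the linked SSML article for the authoritative element reference.

```xml
<speak version="1.0" xmlns="http://www.w3.org/2001/10/synthesis"
       xmlns:mstts="https://www.w3.org/2001/mstts" xml:lang="en-US">
  <voice name="YourCustomNeuralVoiceName">
    <mstts:express-as style="cheerful">
      <prosody rate="-10%" pitch="+5%">
        Welcome back! How can I help you today?
      </prosody>
    </mstts:express-as>
  </voice>
</speak>
```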
+
+## Migrate to Custom Neural Voice
+
+The standard/non-neural training tier (adaptive, statistical parametric, concatenative) of Custom Voice is being deprecated. The announcement was sent out to all existing Speech subscriptions before 2/28/2021. During the deprecation period (3/1/2021 - 2/29/2024), existing standard tier users can continue to use the non-neural models they have created. All new users and new Speech resources should move to the neural tier/Custom Neural Voice. After 2/29/2024, all standard/non-neural custom voices will no longer be supported.
If you are using non-neural/standard Custom Voice, migrate to Custom Neural Voice immediately by following the steps below. Moving to Custom Neural Voice will help you develop more realistic voices for even more natural conversational interfaces, and enable your customers and end users to benefit from the latest text-to-speech technology in a responsible way.

1. Learn more about our [policy on limited access](/legal/cognitive-services/speech-service/custom-neural-voice/limited-access-custom-neural-voice?context=%2fazure%2fcognitive-services%2fspeech-service%2fcontext%2fcontext) and [apply here](https://aka.ms/customneural). Note that access to the Custom Neural Voice service is subject to Microsoft's sole discretion based on our eligibility criteria. Customers may gain access to the technology only after their application is reviewed and they have committed to using it in alignment with our [Responsible AI principles](https://microsoft.com/ai/responsible-ai) and the [code of conduct](/legal/cognitive-services/speech-service/tts-code-of-conduct?context=%2fazure%2fcognitive-services%2fspeech-service%2fcontext%2fcontext).
2. Once your application is approved, you will be provided with access to the "neural" training feature. Make sure you sign in to the [Custom Voice portal](https://speech.microsoft.com/customvoice) using the same Azure subscription that you provide in your application.

> [!IMPORTANT]
- > To protect voice talent and prevent training of voice models with unauthorized recording or without the acknowledgement from the voice talent, we require the customer to upload a recorded statement of the voice talent giving his or her consent. When preparing your recording script, make sure you include this sentence.
+ > To protect voice talent and prevent training of voice models with unauthorized recording or without the acknowledgement from the voice talent, we require the customer to upload a recorded statement of the voice talent giving their consent. When preparing your recording script, make sure you include this sentence.
> "I [state your first and last name] am aware that recordings of my voice will be used by [state the name of the company] to create and use a synthetic version of my voice."
> This sentence must be uploaded to the **Voice Talent** tab as a verbal consent file. It will be used to verify whether the recordings in your training datasets are done by the same person that makes the consent.
3. After the Custom Neural Voice model is created, deploy the voice model to a new endpoint. To create a new custom voice endpoint with your neural voice model, go to **Text-to-Speech > Custom Voice > Deployment**. Select **Deploy model** and enter a **Name** and **Description** for your custom endpoint. Then select the custom neural voice model you would like to associate with this endpoint and confirm the deployment.
If you are using non-neural/standard Custom Voice, migrate to Custom Neural Voi
- [Prepare Custom Voice data](how-to-custom-voice-prepare-data.md)
- [Create a Custom Voice](how-to-custom-voice-create-voice.md)
-- [Guide: Record your voice samples](record-custom-voice-samples.md)
+- [Tutorial: Record your voice samples](record-custom-voice-samples.md)
cognitive-services Long Audio Api https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/long-audio-api.md
More benefits of the Long Audio API:
* There's no need to deploy a voice endpoint.

> [!NOTE]
-> The Long Audio API supports both [Public Neural Voices](./language-support.md#neural-voices) and [Custom Neural Voices](./how-to-custom-voice.md#custom-neural-voices).
+> The Long Audio API supports both [Public Neural Voices](./language-support.md#neural-voices) and [Custom Neural Voices](./how-to-custom-voice.md).
## Workflow
cognitive-services Record Custom Voice Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/record-custom-voice-samples.md
# Record voice samples to create a custom voice
-Creating a high-quality production custom voice from scratch is not a casual undertaking. The central component of a custom voice is a large collection of audio samples of human speech. It's vital that these audio recordings be of high quality. Choose a voice talent who has experience making these kinds of recordings, and have them recorded by a competent recording engineer using professional equipment.
+Creating a high-quality production custom neural voice from scratch is not a casual undertaking. The central component of a custom voice is a large collection of audio samples of human speech. It's vital that these audio recordings be of high quality. Choose a voice talent who has experience making these kinds of recordings, and have them recorded by a recording engineer using professional equipment.
-Before you can make these recordings, though, you need a script: the words that will be spoken by your voice talent to create the audio samples. For best results, your script must have good phonetic coverage and sufficient variety to train the custom voice model.
+Before you can make these recordings, though, you need a script: the words that will be spoken by your voice talent to create the audio samples. For best results, your script must have good phonetic coverage and sufficient variety to train the custom neural voice model.
Many small but important details go into creating a professional voice recording. This guide is a roadmap for a process that will help you get good, consistent results.

> [!NOTE]
-> If you would like to train a neural voice, you must specify a voice talent profile with the audio consent file provided of the voice talent acknowledging to use his/her speech data to train a custom voice model. When preparing your recording script, make sure you include the below sentence.
+> To train a neural voice, you must specify a voice talent profile with an audio consent file in which the voice talent acknowledges the use of their speech data to train a custom voice model. When preparing your recording script, make sure you include the sentence below.
> "I [state your first and last name] am aware that recordings of my voice will be used by [state the name of the company] to create and use a synthetic version of my voice."

This sentence will be used to verify whether the training data is recorded by the same person that gives the consent. Read more about [voice talent verification](/legal/cognitive-services/speech-service/custom-neural-voice/data-privacy-security-custom-neural-voice?context=%2fazure%2fcognitive-services%2fspeech-service%2fcontext%2fcontext).
This sentence will be used to verify if the training data is done by the same pe
> Custom Neural Voice is available with limited access. Make sure you understand the [responsible AI requirements](/legal/cognitive-services/speech-service/custom-neural-voice/limited-access-custom-neural-voice?context=%2fazure%2fcognitive-services%2fspeech-service%2fcontext%2fcontext) and [apply for access here](https://aka.ms/customneural).

> [!TIP]
-> For the highest quality results, consider engaging Microsoft to help develop your custom voice. Microsoft has extensive experience producing high-quality voices for its own products, including Cortana and Office.
+> For the highest quality results, consider engaging Microsoft to help develop your custom neural voice. Microsoft has extensive experience producing high-quality voices for its own products, including Cortana and Office.
## Voice recording roles
-There are four basic roles in a custom voice recording project:
+There are four basic roles in a custom neural voice recording project:
Role|Purpose
-|-
-Voice talent |This person's voice will form the basis of the custom voice.
+Voice talent |This person's voice will form the basis of the custom neural voice.
Recording engineer |Oversees the technical aspects of the recording and operates the recording equipment. Director |Prepares the script and coaches the voice talent's performance.
-Editor |Finalizes the audio files and prepares them for upload to the Custom Voice portal.
+Editor |Finalizes the audio files and prepares them for upload to Speech Studio.
An individual may fill more than one role. This guide assumes that you will be primarily filling the director role and hiring both a voice talent and a recording engineer. If you want to make the recordings yourself, this article includes some information about the recording engineer role. The editor role isn't needed until after the session, so it can be performed by the director or the recording engineer.

## Choose your voice talent
-Actors with experience in voiceover or voice character work make good custom voice talent. You can also often find suitable talent among announcers and newsreaders.
+Actors with experience in voiceover or voice character work make good custom neural voice talent. You can also often find suitable talent among announcers and newsreaders. Choose voice talent whose natural voice you like. It is possible to create unique "character" voices, but it's much harder for most talent to perform them consistently, and the effort can cause voice strain. The single most important factor for choosing voice talent is consistency. Your recordings should all sound like they were made on the same day in the same room. You can approach this ideal through good recording practices and engineering.
-Choose voice talent whose natural voice you like. It is possible to create unique "character" voices, but it's much harder for most talent to perform them consistently, and the effort can cause voice strain.
-
-> [!TIP]
-> Generally, avoid using recognizable voices to create a custom voiceΓÇöunless, of course, your goal is to produce a celebrity voice. Lesser-known voices are usually less distracting to users.
-
-The single most important factor for choosing voice talent is consistency. Your recordings should all sound like they were made on the same day in the same room. You can approach this ideal through good recording practices and engineering.
-
-Your voice talent is the other half of the equation. They must be able to speak with consistent rate, volume level, pitch, and tone. Clear diction is a must. The talent also needs to be able to strictly control their pitch variation, emotional affect, and speech mannerisms.
-
-Recording custom voice samples can be more fatiguing than other kinds of voice work. Most voice talent can record for two or three hours a day. Limit sessions to three or four a week, with a day off in-between if possible.
+Your voice talent is the other half of the equation. They must be able to speak with consistent rate, volume level, pitch, and tone. Clear diction is a must. The talent also needs to be able to strictly control their pitch variation, emotional affect, and speech mannerisms. Recording voice samples can be more fatiguing than other kinds of voice work. Most voice talent can record for two or three hours a day. Limit sessions to three or four a week, with a day off in-between if possible.
Work with your voice talent to develop a "persona" that defines the overall sound and emotional tone of the custom voice. In the process, you'll pinpoint what "neutral" sounds like for that persona. Using the Custom Neural Voice capability, you can train a model that speaks with emotions. Define the "speaking styles" and ask your voice talent to read the script in a way that resonates with the styles you want.
A persona might have, for example, a naturally upbeat personality. So "their" vo
## Create a script
-The starting point of any custom voice recording session is the script, which contains the utterances to be spoken by your voice talent. (The term "utterances" encompasses both full sentences and shorter phrases.)
+The starting point of any custom neural voice recording session is the script, which contains the utterances to be spoken by your voice talent. (The term "utterances" encompasses both full sentences and shorter phrases.)
The utterances in your script can come from anywhere: fiction, non-fiction, transcripts of speeches, news reports, and anything else available in printed form. If you want to make sure your voice does well on specific kinds of words (such as medical terminology or programming jargon), you might want to include sentences from scholarly papers or technical documents. For a brief discussion of potential legal issues, see the ["Legalities"](#legalities) section. You can also write your own text.
-Your utterances don't need to come from the same source, or the same kind of source. They don't even need to have anything to do with each other. However, if you will use set phrases (for example, "You have successfully logged in") in your speech application, make sure to include them in your script. This will give your custom voice a better chance of pronouncing those phrases well. And if you should decide to use a recording in place of synthesized speech, you'll already have it in the same voice.
+Your utterances don't need to come from the same source, or the same kind of source. They don't even need to have anything to do with each other. However, if you will use set phrases (for example, "You have successfully logged in") in your speech application, make sure to include them in your script. This will give your custom neural voice a better chance of pronouncing those phrases well. And if you should decide to use a recording in place of synthesized speech, you'll already have it in the same voice.
While consistency is key in choosing voice talent, variety is the hallmark of a good script. Your script should include many different words and sentences with a variety of sentence lengths, structures, and moods. Every sound in the language should be represented multiple times and in numerous contexts (called *phonetic coverage*). Furthermore, the text should incorporate all the ways that a particular sound can be represented in writing, and place each sound at varying places in the sentences. Both declarative sentences and questions should be included and read with appropriate intonation.
-It's difficult to write a script that provides *just enough* data to allow the Custom Speech portal to build a good voice. In practice, the simplest way to make a script that achieves robust phonetic coverage is to include a large number of samples. The standard voices that Microsoft provides were built from tens of thousands of utterances. You should be prepared to record a few to several thousand utterances at minimum to build a production-quality custom voice.
+It's difficult to write a script that provides *just enough* data to allow Speech Studio to build a good voice. In practice, the simplest way to make a script that achieves robust phonetic coverage is to include a large number of samples. The standard voices that Microsoft provides were built from tens of thousands of utterances. You should be prepared to record a few to several thousand utterances at minimum to build a production-quality custom neural voice.
Check the script carefully for errors. If possible, have someone else check it too. When you run through the script with your talent, you'll probably catch a few more mistakes.

### Script format
-You can write your script in Microsoft Word. The script is for use during the recording session, so you can set it up any way you find easy to work with. Create the text file that's required by the Custom Voice portal separately.
+You can write your script in Microsoft Word. The script is for use during the recording session, so you can set it up any way you find easy to work with. Create the text file that's required by Speech Studio separately.
A basic script format contains three columns:
Print three copies of the script: one for the talent, one for the engineer, and
### Legalities
-Under copyright law, an actor's reading of copyrighted text might be a performance for which the author of the work should be compensated. This performance will not be recognizable in the final product, the custom voice. Even so, the legality of using a copyrighted work for this purpose is not well established. Microsoft cannot provide legal advice on this issue; consult your own counsel.
+Under copyright law, an actor's reading of copyrighted text might be a performance for which the author of the work should be compensated. This performance will not be recognizable in the final product, the custom neural voice. Even so, the legality of using a copyrighted work for this purpose is not well established. Microsoft cannot provide legal advice on this issue; consult your own counsel.
Fortunately, it is possible to avoid these issues entirely. There are many sources of text you can use without permission or license.
Fortunately, it is possible to avoid these issues entirely. There are many sourc
|Works no longer<br>under copyright|Typically works published prior to 1923. For English, [Project Gutenberg](https://www.gutenberg.org/) offers tens of thousands of such works. You may want to focus on newer works, as the language will be closer to modern English.|
|Government&nbsp;works|Works created by the United States government are not copyrighted in the United States, though the government may claim copyright in other countries/regions.|
|Public domain|Works for which copyright has been explicitly disclaimed or that have been dedicated to the public domain. It may not be possible to waive copyright entirely in some jurisdictions.|
-|Permissively-licensed works|Works distributed under a license like Creative Commons or the GNU Free Documentation License (GFDL). Wikipedia uses the GFDL. Some licenses, however, may impose restrictions on performance of the licensed content that may impact the creation of a custom voice model, so read the license carefully.|
+|Permissively-licensed works|Works distributed under a license like Creative Commons or the GNU Free Documentation License (GFDL). Wikipedia uses the GFDL. Some licenses, however, may impose restrictions on performance of the licensed content that may impact the creation of a custom neural voice model, so read the license carefully.|
## Recording your script
Here, most of the range (height) is used, but the highest peaks of the signal do
Record directly into the computer via a high-quality audio interface or a USB port, depending on the mic you're using. For analog, keep the audio chain simple: mic, preamp, audio interface, computer. You can license both [Avid Pro Tools](https://www.avid.com/en/pro-tools) and [Adobe Audition](https://www.adobe.com/products/audition.html) monthly at a reasonable cost. If your budget is extremely tight, try the free [Audacity](https://www.audacityteam.org/).
-Record at 44.1 kHz 16 bit monophonic (CD quality) or better. Current state-of-the-art is 48 kHz 24-bit, if your equipment supports it. You will down-sample your audio to 16 kHz 16-bit before you submit it to the Custom Voice portal. Still, it pays to have a high-quality original recording in the event edits are needed.
+Record at 44.1 kHz 16 bit monophonic (CD quality) or better. Current state-of-the-art is 48 kHz 24-bit, if your equipment supports it. You will down-sample your audio to 16 kHz 16-bit before you submit it to Speech Studio. Still, it pays to have a high-quality original recording in the event edits are needed.
Ideally, have different people serve in the roles of director, engineer, and talent. Don't try to do it all yourself. In a pinch, one person can be both the director and the engineer.
The match file is especially important when you resume recording after a break o
Coach your talent to take a deep breath and pause for a moment before each utterance. Record a couple of seconds of silence between utterances. Words should be pronounced the same way each time they appear, considering context. For example, "record" as a verb is pronounced differently from "record" as a noun.
-Record a good five seconds of silence before the first recording to capture the "room tone." This practice helps the Custom Voice portal compensate for any remaining noise in the recordings.
+Record approximately five seconds of silence before the first recording to capture the "room tone." This practice helps Speech Studio compensate for any remaining noise in the recordings.
> [!TIP]
-> All you really need to capture is the voice talent, so you can make a monophonic (single-channel) recording of just their lines. However, if you record in stereo, you can use the second channel to record the chatter in the control room to capture discussion of particular lines or takes. Remove this track from the version that's uploaded to the Custom Voice portal.
+> All you really need to capture is the voice talent, so you can make a monophonic (single-channel) recording of just their lines. However, if you record in stereo, you can use the second channel to record the chatter in the control room to capture discussion of particular lines or takes. Remove this track from the version that's uploaded to Speech Studio.
Listen closely, using headphones, to the voice talent's performance. You're looking for good but natural diction, correct pronunciation, and a lack of unwanted sounds. Don't hesitate to ask your talent to re-record an utterance that doesn't meet these standards.

> [!TIP]
-> If you are using a large number of utterances, a single utterance might not have a noticeable effect on the resultant custom voice. It might be more expedient to simply note any utterances with issues, exclude them from your dataset, and see how your custom voice turns out. You can always go back to the studio and record the missed samples later.
+> If you are using a large number of utterances, a single utterance might not have a noticeable effect on the resultant custom neural voice. It might be more expedient to simply note any utterances with issues, exclude them from your dataset, and see how your custom neural voice turns out. You can always go back to the studio and record the missed samples later.
Note the take number or time code on your script for each utterance. Ask the engineer to mark each utterance in the recording's metadata or cue sheet as well.
Take regular breaks and provide a beverage to help your voice talent keep their
Modern recording studios run on computers. At the end of the session, you receive one or more audio files, not a tape. These files will probably be WAV or AIFF format in CD quality (44.1 kHz 16-bit) or better. 48 kHz 24-bit is common and desirable. Higher sampling rates, such as 96 kHz, are generally not needed.
-The Custom Voice portal requires each provided utterance to be in its own file. Each audio file delivered by the studio contains multiple utterances. So the primary post-production task is to split up the recordings and prepare them for submission. The recording engineer might have placed markers in the file (or provided a separate cue sheet) to indicate where each utterance starts.
+Speech Studio requires each provided utterance to be in its own file. Each audio file delivered by the studio contains multiple utterances. So the primary post-production task is to split up the recordings and prepare them for submission. The recording engineer might have placed markers in the file (or provided a separate cue sheet) to indicate where each utterance starts.
Use your notes to find the exact takes you want, and then use a sound editing utility, such as [Avid Pro Tools](https://www.avid.com/en/pro-tools), [Adobe Audition](https://www.adobe.com/products/audition.html), or the free [Audacity](https://www.audacityteam.org/), to copy each utterance into a new file.
-Leave only about 0.2 seconds of silence at the beginning and end of each clip, except for the first. That file should start with a full five seconds of silence. Do not use an audio editor to "zero out" silent parts of the file. Including the "room tone" will help the Custom Voice algorithms compensate for any residual background noise.
+Leave only about 0.2 seconds of silence at the beginning and end of each clip, except for the first. That file should start with a full five seconds of silence. Do not use an audio editor to "zero out" silent parts of the file. Including the "room tone" will help the algorithms compensate for any residual background noise.
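The trimming step above can be sketched in pure Python for 16-bit mono PCM files. The amplitude threshold and file paths below are illustrative assumptions; in practice a sound editor or DSP library is the usual tool, and note that this trims silence rather than zeroing it out, so the room tone that remains is preserved.

```python
import wave


def trim_leading_silence(src_path, dst_path, keep_seconds=0.2, threshold=300):
    """Trim leading near-silence down to keep_seconds (16-bit mono PCM WAV).
    threshold is the amplitude below which a sample counts as silence."""
    with wave.open(src_path, "rb") as src:
        rate = src.getframerate()
        raw = src.readframes(src.getnframes())

    # Decode 16-bit little-endian samples.
    samples = [int.from_bytes(raw[i:i + 2], "little", signed=True)
               for i in range(0, len(raw), 2)]

    # Find the first sample louder than the threshold.
    first_voiced = next((i for i, s in enumerate(samples) if abs(s) > threshold),
                        len(samples))

    # Keep at most keep_seconds of room tone before the first voiced sample.
    keep = int(keep_seconds * rate)
    start = max(0, first_voiced - keep)

    out = b"".join(s.to_bytes(2, "little", signed=True) for s in samples[start:])
    with wave.open(dst_path, "wb") as dst:
        dst.setnchannels(1)
        dst.setsampwidth(2)
        dst.setframerate(rate)
        dst.writeframes(out)
```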
Listen to each file carefully. At this stage, you can edit out small unwanted sounds that you missed during recording, like a slight lip smack before a line, but be careful not to remove any actual speech. If you can't fix a file, remove it from your dataset and note that you have done so.
-Convert each file to 16 bits and a sample rate of 16 kHz before saving and, if you recorded the studio chatter, remove the second channel. Save each file in WAV format, naming the files with the utterance number from your script.
+Convert each file to 16 bits and a sample rate of 24 kHz before saving and, if you recorded the studio chatter, remove the second channel. Save each file in WAV format, naming the files with the utterance number from your script.
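Before packaging, it's worth verifying programmatically that every clip matches the target format. Here is a minimal sketch using only Python's standard-library `wave` module; the function name and defaults are illustrative, not part of any Microsoft tool:

```python
import wave

def check_clip(path, rate=24000, sampwidth=2, channels=1):
    """Return a list of format problems for one clip (empty list means OK)."""
    problems = []
    with wave.open(path, "rb") as w:
        if w.getframerate() != rate:
            problems.append(f"sample rate {w.getframerate()} Hz, expected {rate} Hz")
        if w.getsampwidth() != sampwidth:
            problems.append(f"{8 * w.getsampwidth()}-bit samples, expected {8 * sampwidth}-bit")
        if w.getnchannels() != channels:
            problems.append(f"{w.getnchannels()} channels, expected {channels}")
    return problems
```

Running this check over the whole dataset before upload catches any clip that skipped the conversion step.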
-Finally, create the *transcript* that associates each WAV file with a text version of the corresponding utterance. [Creating custom voices](./how-to-custom-voice-create-voice.md) includes details of the required format. You can copy the text directly from your script. Then create a Zip file of the WAV files and the text transcript.
+Finally, create the *transcript* that associates each WAV file with a text version of the corresponding utterance. [Create and use your voice model](./how-to-custom-voice-create-voice.md) includes details of the required format. You can copy the text directly from your script. Then create a Zip file of the WAV files and the text transcript.
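As a sketch of the packaging step, the following assumes a tab-separated filename-to-text transcript layout; confirm the exact required format in the linked article before uploading. The function and file names are illustrative:

```python
import zipfile
from pathlib import Path

def package_dataset(script, wav_dir, out_zip):
    """Write a transcript for the clips in `script` ({utterance id: text})
    and bundle it with the matching WAV files for upload."""
    wav_dir = Path(wav_dir)
    # One line per utterance: "<id>\t<exact script text>".
    lines = [f"{uid}\t{text}" for uid, text in sorted(script.items())]
    transcript = wav_dir / "transcript.txt"
    transcript.write_text("\n".join(lines) + "\n", encoding="utf-8")

    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as z:
        z.write(transcript, transcript.name)
        for uid in script:
            z.write(wav_dir / f"{uid}.wav", f"{uid}.wav")
    return out_zip
```

Keeping the transcript IDs identical to the WAV file names makes it easy to spot a clip that is missing its text, or vice versa.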
Archive the original recordings in a safe place in case you need them later. Preserve your script and notes, too. ## Next steps
-You're ready to upload your recordings and create your custom voice.
+You're ready to upload your recordings and create your custom neural voice.
> [!div class="nextstepaction"]
-> [Create custom voice fonts](./how-to-custom-voice-create-voice.md)
+> [Create and use your voice model](./how-to-custom-voice-create-voice.md)
cognitive-services Releasenotes https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/releasenotes.md
Previously updated : 04/20/2021 Last updated : 05/15/2021 # Speech Service release notes
+## Speech SDK 1.17.0: 2021-May release
+
+>[!NOTE]
+>Get started with the Speech SDK [here](https://docs.microsoft.com/azure/cognitive-services/speech-service/speech-sdk#get-the-speech-sdk).
+
+**Highlights summary**
+
+- Smaller footprint - we continue to decrease the memory and disk footprint of the Speech SDK and its components.
+- A new stand-alone language detection API allows you to recognize what language is being spoken.
+- Develop speech enabled mixed reality and gaming applications using Unity on macOS.
+- You can now use text-to-speech in addition to speech recognition from the Go programming language.
+- Several bug fixes to address issues YOU, our valued customers, have flagged on GitHub! THANK YOU! Keep the feedback coming!
+
+#### New features
+
+- **C++/C#**: New stand-alone Single-Shot/At-Start and Continuous Language Detection via the `SourceLanguageRecognizer` API. If you only want to detect the language(s) spoken in audio content, this is the API to do that.
+- **C++/C#**: Speech Recognition and Translation Recognition now support both single-shot and continuous Language Detection so you can programmatically determine which language(s) are being spoken before they are transcribed or translated. See documentation [here for Speech Recognition](https://docs.microsoft.com/azure/cognitive-services/speech-service/how-to-automatic-language-detection) and [here for Speech Translation](https://docs.microsoft.com/azure/cognitive-services/speech-service/get-started-speech-translation).
+- **C#**: Added Unity support on macOS (x64). This unlocks speech recognition and speech synthesis use cases in mixed reality and gaming!
+- **Go**: We added support for speech synthesis/text-to-speech to the Go programming language to make speech synthesis available in even more use cases. See our [quickstart](https://docs.microsoft.com/azure/cognitive-services/speech-service/get-started-text-to-speech?tabs=windowsinstall&pivots=programming-language-go) or our [reference documentation](https://pkg.go.dev/github.com/Microsoft/cognitive-services-speech-sdk-go).
+- **C++/C#/Java/Python/Objective-C/Go**: The speech synthesizer now supports the `connection` object. This helps you manage and monitor the connection to the speech service, and is especially helpful to pre-connect to reduce latency. See documentation [here](https://docs.microsoft.com/azure/cognitive-services/speech-service/how-to-lower-tts-latency).
+- **C++/C#/Java/Python/Objective-C/Go**: We now expose the latency and underrun time in `SpeechSynthesisResult` to help you monitor and diagnose speech synthesis latency issues. See details for [C++](https://docs.microsoft.com/cpp/cognitive-services/speech/speechsynthesisresult), [C#](https://docs.microsoft.com/dotnet/api/microsoft.cognitiveservices.speech.speechsynthesisresult), [Java](https://docs.microsoft.com/java/api/com.microsoft.cognitiveservices.speech.speechsynthesisresult), [Python](https://docs.microsoft.com/python/api/azure-cognitiveservices-speech/azure.cognitiveservices.speech.speechsynthesisresult), [Objective-C](https://docs.microsoft.com/objectivec/cognitive-services/speech/spxspeechsynthesisresult) and [Go](https://pkg.go.dev/github.com/Microsoft/cognitive-services-speech-sdk-go#readme-reference).
+- **C++/C#/Java/Python/Objective-C/Go**: We added a Gender property to the synthesis voice info to make it easier to select voices based on gender. This addresses [GitHub issue #1055](https://github.com/Azure-Samples/cognitive-services-speech-sdk/issues/1055).
+- **C++, C#, Java, JavaScript**: We now support `retrieveEnrollmentResultAsync`, `getAuthorizationPhrasesAsync`, and `getAllProfilesAsync` in Speaker Recognition to make it easier to manage all voice profiles for a given account. See documentation for [C++](https://docs.microsoft.com/cpp/cognitive-services/speech/voiceprofileclient), [C#](https://docs.microsoft.com/dotnet/api/microsoft.cognitiveservices.speech.voiceprofileclient?view=azure-dotnet), [Java](https://docs.microsoft.com/java/api/com.microsoft.cognitiveservices.speech.voiceprofileclient?view=azure-java-stable), [JavaScript](https://docs.microsoft.com/javascript/api/microsoft-cognitiveservices-speech-sdk/voiceprofileclient?view=azure-node-latest). This addresses [GitHub issue #338](https://github.com/microsoft/cognitive-services-speech-sdk-js/issues/338).
+- **JavaScript**: We added retry for connection failures that will make your JavaScript based speech applications more robust.
+
+#### Improvements
+
+- Linux and Android Speech SDK binaries have been updated to use the latest version of OpenSSL (1.1.1k)
+- Code Size improvements:
+ - Language Understanding is now split into a separate "lu" library.
+ - Windows x64 core binary size decreased by 14.4%.
+ - Android ARM64 core binary size decreased by 13.7%.
+  - Other components also decreased in size.
+
+#### Bug fixes
+
+- **All**: Fixed [GitHub issue #842](https://github.com/Azure-Samples/cognitive-services-speech-sdk/issues/842) for ServiceTimeout. You can now transcribe very long audio files using the Speech SDK without the connection to the service terminating with this error. However, we still recommend you use [batch transcription](https://docs.microsoft.com/azure/cognitive-services/speech-service/batch-transcription) for long files.
+- **C#**: Fixed [GitHub issue #947](https://github.com/Azure-Samples/cognitive-services-speech-sdk/issues/947) where no speech input could leave your app in a bad state.
+- **Java**: Fixed [GitHub Issue #997](https://github.com/Azure-Samples/cognitive-services-speech-sdk/issues/997) where the Java Speech SDK 1.16 crashes when using DialogServiceConnector without a network connection or an invalid subscription key.
+- Fixed a crash when abruptly stopping speech recognition (for example, using CTRL+C in a console app).
+- **Java**: Added a fix to delete temporary files on Windows when using Java Speech SDK.
+- **Java**: Fixed [GitHub issue #994](https://github.com/Azure-Samples/cognitive-services-speech-sdk/issues/994) where calling `DialogServiceConnector.stopListeningAsync` could result in an error.
+- **Java**: Fixed a customer issue in the [virtual assistant quickstart](https://github.com/Azure-Samples/cognitive-services-speech-sdk/tree/master/quickstart/java/jre/virtual-assistant).
+- **JavaScript**: Fixed [GitHub issue #366](https://github.com/microsoft/cognitive-services-speech-sdk-js/issues/366) where `ConversationTranslator` threw an error 'this.cancelSpeech is not a function'.
+- **JavaScript**: Fixed [GitHub issue #298](https://github.com/microsoft/cognitive-services-speech-sdk-js/issues/298) where 'Get result as an in-memory stream' sample played sound out loud.
+- **JavaScript**: Fixed [GitHub issue #350](https://github.com/microsoft/cognitive-services-speech-sdk-js/issues/350) where calling `AudioConfig` could result in a 'ReferenceError: MediaStream is not defined'.
+- **JavaScript**: Fixed an UnhandledPromiseRejection warning in Node.js for long-running sessions.
+
+#### Samples
+
+- Updated Unity samples documentation for macOS [here](https://github.com/Azure-Samples/cognitive-services-speech-sdk).
+- A React Native sample for the Cognitive Services speech recognition service is now available [here](https://github.com/microsoft/cognitive-services-sdk-react-native-example).
+
+## Speech CLI (also known as SPX): 2021-May release
+
+>[!NOTE]
+>Get started with the Azure Speech service command line interface (CLI) [here](https://docs.microsoft.com/azure/cognitive-services/speech-service/spx-basics). The CLI enables you to use the Azure Speech service without writing any code.
+
+#### New features
+
+- SPX now supports profiles, speaker identification, and speaker verification. Try `spx profile` and `spx speaker` from the SPX command line.
+- We also added Dialog support - Try `spx dialog` from the SPX command line.
+- SPX help improvements. Please give us feedback about how this works for you by opening a [GitHub issue](https://github.com/Azure-Samples/cognitive-services-speech-sdk/issues?q=is%3Aissue+is%3Aopen).
+- We've decreased the size of the SPX .NET tool install.
+
+**COVID-19 abridged testing**:
+
+As the ongoing pandemic continues to require our engineers to work from home, pre-pandemic manual verification scripts have been significantly reduced. We test on fewer devices with fewer configurations, and the likelihood of environment-specific bugs slipping through may be increased. We still rigorously validate with a large set of automation. In the unlikely event that we missed something, please let us know on [GitHub](https://github.com/Azure-Samples/cognitive-services-speech-sdk/issues?q=is%3Aissue+is%3Aopen).<br>
+Stay healthy!
++ ## Text-to-speech 2021-April release **Neural TTS is available across 21 regions**
More samples have been added and are constantly being updated. For the latest se
## Cognitive Services Speech SDK 0.2.12733: 2018-May release
-This release is the first public preview release of the Cognitive Services Speech SDK.
+This release is the first public preview release of the Cognitive Services Speech SDK.
cognitive-services Speech Synthesis Markup https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/speech-synthesis-markup.md
As an example, you might want to know the time offset of each flower word as fol
You can subscribe to the `BookmarkReached` event in Speech SDK to get the bookmark offsets. > [!NOTE]
-> `BookmarkReached` event is only available since Speech SDK version 1.16.0.
+> The `BookmarkReached` event is only available since Speech SDK version 1.16.
`BookmarkReached` events are raised as the output audio data becomes available, which will be faster than playback to an output device.
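For the flower example above, a `bookmark` element goes in the SSML immediately before each word of interest. This is a sketch following the `bookmark` syntax in the SSML reference; the voice name and mark values are illustrative:

```xml
<speak version="1.0" xmlns="http://www.w3.org/2001/10/synthesis" xml:lang="en-US">
  <voice name="en-US-AriaNeural">
    We are selling <bookmark mark="flower_1"/>roses and <bookmark mark="flower_2"/>daisies.
  </voice>
</speak>
```

Each `BookmarkReached` event then reports the mark name and the audio offset where synthesis reached it.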
cognitive-services Text To Speech https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/text-to-speech.md
This documentation contains the following article types:
* Neural voices - Deep neural networks are used to overcome the limits of traditional speech synthesis with regard to stress and intonation in spoken language. Prosody prediction and voice synthesis are performed simultaneously, which results in more fluid and natural-sounding outputs. Neural voices can be used to make interactions with chatbots and voice assistants more natural and engaging, convert digital texts such as e-books into audiobooks, and enhance in-car navigation systems. With the human-like natural prosody and clear articulation of words, neural voices significantly reduce listening fatigue when you interact with AI systems. For a full list of neural voices, see [supported languages](language-support.md#text-to-speech).
-* Adjust speaking styles with SSML - Speech Synthesis Markup Language (SSML) is an XML-based markup language used to customize speech-to-text outputs. With SSML, you can adjust pitch, add pauses, improve pronunciation, speed up or slow down speaking rate, increase or decrease volume, and attribute multiple voices to a single document. See the [how-to](speech-synthesis-markup.md) for adjusting speaking styles.
+* Adjust speaking styles with SSML - Speech Synthesis Markup Language (SSML) is an XML-based markup language used to customize text-to-speech outputs. With SSML, you can adjust pitch, add pauses, improve pronunciation, change speaking rate, adjust volume, and attribute multiple voices to a single document. See the [how-to](speech-synthesis-markup.md) for adjusting speaking styles.
* Visemes - [Visemes](how-to-speech-synthesis-viseme.md) are the key poses in observed speech, including the position of the lips, jaw and tongue when producing a particular phoneme. Visemes have a strong correlation with voices and phonemes. Using viseme events in Speech SDK, you can generate facial animation data, which can be used to animate faces in lip-reading communication, education, entertainment, and customer service. Viseme is currently only supported for the `en-US` English (United States) [neural voices](language-support.md#text-to-speech).
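As a sketch of the SSML adjustments described above, a single document can slow one passage, raise its pitch, insert a pause, and hand off to a second voice. The voice names and values here are illustrative:

```xml
<speak version="1.0" xmlns="http://www.w3.org/2001/10/synthesis" xml:lang="en-US">
  <voice name="en-US-JennyNeural">
    <prosody rate="-10%" pitch="+5%">
      This passage is read a little more slowly, at a slightly higher pitch.
    </prosody>
    <break time="500ms"/>
  </voice>
  <voice name="en-US-GuyNeural">
    A second voice can take over within the same document.
  </voice>
</speak>
```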
Sample code for text-to-speech is available on GitHub. These samples cover text-
## Customization
-In addition to neural voices, you can create and fine-tune custom voices unique to your product or brand. All it takes to get started are a handful of audio files and the associated transcriptions. For more information, see [Get started with Custom Voice](how-to-custom-voice.md)
+In addition to neural voices, you can create and fine-tune custom voices unique to your product or brand. All it takes to get started are a handful of audio files and the associated transcriptions. For more information, see [Get started with Custom Neural Voice](how-to-custom-voice.md).
## Pricing note
cognitive-services Character Counts https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Translator/character-counts.md
- Title: Character Counts - Translator-
-description: This article explains how the Azure Cognitive Services Translator counts characters so you can understand how it ingests content.
------ Previously updated : 05/26/2020---
-# How the Translator counts characters
-
-The Translator counts every unicode code point of input text as a character. Each translation of a text to a language counts as a separate translation, even if the request was made in a single API call translating to multiple languages. The length of the response does not matter.
-
-What counts is:
-
-* Text passed to Translator in the body of the request
- * `Text` when using the Translate, Transliterate, and Dictionary Lookup methods
- * `Text` and `Translation` when using the Dictionary Examples method
-* All markup: HTML, XML tags, etc. within the text field of the request body. JSON notation used to build the request (for instance "Text:") is not counted.
-* An individual letter
-* Punctuation
-* A space, tab, markup, and any kind of white space character
-* Every code point defined in Unicode
-* A repeated translation, even if you have translated the same text previously
-
-For scripts based on ideograms such as Chinese and Japanese Kanji, the Translator service still counts the number of Unicode code points, one character per ideogram. Exception: Unicode surrogates count as two characters.
-
-The number of requests, words, bytes, or sentences is irrelevant in the character count.
-
-Calls to the Detect and BreakSentence methods are not counted in the character consumption. However, we do expect that the calls to the Detect and BreakSentence methods are in a reasonable proportion to the use of other functions that are counted. If the number of Detect or BreakSentence calls you make exceeds the number of other counted methods by 100 times, Microsoft reserves the right to restrict your use of the Detect and BreakSentence methods.
-
-Every character submitted to the translate function is counted even when the content is not changed or when the source and target language are the same.
-
-More information about character counts is in the [Translator FAQ](https://www.microsoft.com/en-us/translator/faq.aspx).
cognitive-services Get Documents Status https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Translator/document-translation/reference/get-documents-status.md
Request parameters passed on the query string are:
|Query parameter|In|Required|Type|Description| | | | | | | |id|path|True|string|The operation ID.|
-|endpoint|path|True|string|Supported Cognitive Services endpoints (protocol and hostname, for example: `https://westus.api.cognitive.microsoft.com`).|
|$maxpagesize|query|False|integer int32|$maxpagesize is the maximum items returned in a page. If more items are requested via $top (or $top is not specified and there are more items to be returned), @nextLink will contain the link to the next page. Clients MAY request server-driven paging with a specific page size by specifying a $maxpagesize preference. The server SHOULD honor this preference if the specified page size is smaller than the server's default page size.| |$orderBy|query|False|array|The sorting query for the collection (ex: 'CreatedDateTimeUtc asc', 'CreatedDateTimeUtc desc').| |$skip|query|False|integer int32|$skip indicates the number of records to skip from the list of records held by the server based on the sorting method specified. By default, we sort by descending start time. Clients MAY use $top and $skip query parameters to specify a number of results to return and an offset into the collection. When both $top and $skip are given by a client, the server SHOULD first apply $skip and then $top on the collection. Note: If the server can't honor $top and/or $skip, the server MUST return an error to the client informing about it instead of just ignoring the query options.|
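The paging parameters above combine into a request URL. Here is a quick sketch in Python's standard library; the endpoint value and route are placeholders patterned on this reference, and only the `$skip`/`$top`/`$maxpagesize` handling is the point:

```python
from urllib.parse import urlencode

def documents_status_url(endpoint, operation_id, skip=0, top=None, maxpagesize=50):
    """Build a documents-status query URL with client paging hints."""
    params = {"$skip": skip, "$maxpagesize": maxpagesize}
    if top is not None:
        params["$top"] = top
    # Keep "$" readable in the query string instead of percent-encoding it.
    query = urlencode(params, safe="$")
    return f"{endpoint}/translator/text/batch/v1.0/batches/{operation_id}/documents?{query}"
```

When the server returns `@nextLink`, follow that link for the next page rather than recomputing `$skip` yourself.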
cognitive-services Quickstart Translator https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Translator/quickstart-translator.md
If you're encountering connection issues, it may be that your SSL certificate ha
## Next steps
-* [Learn how the API counts characters](character-counts.md)
-* [Customize and improve translation](customization.md)
-
-## See also
-
-* [Translator v3 API reference](reference/v3-0-reference.md)
-* [Language support](language-support.md)
+> [!div class="nextstepaction"]
+> [Customize and improve translation](customization.md)
cognitive-services Translator Faq https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Translator/translator-faq.md
+
+ Title: Frequently asked questions - Translator
+
+description: Get answers to frequently asked questions about the Translator API in Azure Cognitive Services.
+++++++ Last updated : 05/18/2021+++
+# Frequently asked questions - Translator API
+
+## How does Translator count characters?
+
+Translator counts every code point defined in Unicode as a character. Each translation counts as a separate translation, even if the request was made in a single API call translating to multiple languages. The length of the response doesn't matter, and the number of requests, words, bytes, or sentences isn't relevant to the character count.
+
+Translator counts the following input:
+
+* Text passed to Translator in the body of a request.
+ * `Text` when using the [Translate](reference/v3-0-translate.md), [Transliterate](reference/v3-0-transliterate.md), and [Dictionary Lookup](reference/v3-0-dictionary-lookup.md) methods
+ * `Text` and `Translation` when using the [Dictionary Examples](reference/v3-0-dictionary-examples.md) method.
+
+* All markup: HTML, XML tags, etc. within the request body text field. JSON notation used to build the request (for instance the key "Text:") is **not** counted.
+* An individual letter.
+* Punctuation.
+* A space, tab, markup, or any white-space character.
+* A repeated translation, even if you've previously translated the same text. Every character submitted to the translate function is counted even when the content is unchanged or the source and target language are the same.
+
+For scripts based on graphic symbols, such as written Chinese and Japanese Kanji, the Translator service counts the number of Unicode code points: one character per symbol. Exception: Unicode surrogate pairs count as two characters.
+
+Calls to the **Detect** and **BreakSentence** methods aren't counted in the character consumption. However, we do expect calls to the Detect and BreakSentence methods to be reasonably proportionate to the use of other counted functions. If the number of Detect or BreakSentence calls exceeds the number of other counted methods by 100 times, Microsoft reserves the right to restrict your use of the Detect and BreakSentence methods.
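Because surrogate pairs count as two characters while every other code point counts as one, the billed total equals the UTF-16 code-unit length of the text. A small Python sketch of that rule (the helper is illustrative, not an official SDK function):

```python
def billed_characters(text: str) -> int:
    """Count characters per the rules above: every Unicode code point counts
    once, except code points outside the Basic Multilingual Plane (surrogate
    pairs in UTF-16), which count twice. That equals the UTF-16 code-unit
    length of the string."""
    return len(text.encode("utf-16-le")) // 2

assert billed_characters("你好") == 2          # one character per ideogram
assert len("𠮷") == 1                          # one Python code point...
assert billed_characters("𠮷") == 2            # ...but two billed characters
assert billed_characters("<b>Hi</b>\t") == 10  # markup and whitespace count
```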
+
+## Where can I see my monthly usage?
+
+The [Azure pricing calculator](https://azure.microsoft.com/pricing/calculator/) can be used to estimate your costs. You can also monitor, view, and add Azure alerts for your Azure services in your user account in the Azure portal:
+
+1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Navigate to your Translator resource Overview page.
+1. Select the **subscription** for your Translator resource.
+
+ :::image type="content" source="media/azure-portal-overview.png" alt-text="Screenshot of the subscription link on overview page in the Azure portal.":::
+
+2. In the left rail, make your selection under **Cost Management**:
+
+ :::image type="content" source="media/azure-portal-cost-management.png" alt-text="Screenshot of the cost management resources links in the Azure portal.":::
+
+## Is attribution required when using Translator?
+
+Attribution isn't required when using Translator for text and speech translation. It is recommended that you inform users that the content they're viewing is machine translated.
+
+If attribution is present, it must conform to the [Translator attribution guidelines](https://www.microsoft.com/translator/business/attribution/).
+
+## Is Translator a replacement for a human translator?
+
+No, both have their place as essential tools for communication. Use machine translation where the quantity of content, speed of creation, and budget constraints make it impossible to use human translation.
+
+Machine translation has been used as a first pass by several of our [language service provider (LSP)](https://www.microsoft.com/translator/business/partners/) partners before human translation, and it can improve productivity by up to 50 percent. For a list of LSP partners, visit the Translator partner page.
++
+> [!TIP]
+> If you can't find answers to your questions in this FAQ, try asking the Translator API community on [StackOverflow](https://stackoverflow.com/search?q=%5Bmicrosoft-cognitive%5D+or+%5Bmicrosoft-cognitive%5D+translator&s=34bf0ce2-b6b3-4355-86a6-d45a1121fe27).
cognitive-services Translator Info Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Translator/translator-info-overview.md
keywords: translator, text translation, machine translation, translation service
Translator is a cloud-based machine translation service and is part of the [Azure Cognitive Services](../../index.yml?panel=ai&pivot=products) family of cognitive APIs used to build intelligent apps. Translator is easy to integrate in your applications, websites, tools, and solutions. It allows you to add multi-language user experiences in [90 languages and dialects](./language-support.md) and can be used for text translation with any operating system.
-This documentation contains the following article types:
-
-* [**Quickstarts**](quickstart-translator.md) are getting-started instructions to guide you through making requests to the service.
-* [**How-to guides**](translator-how-to-signup.md) contain instructions for using the service in more specific or customized ways.
-* [**Concepts**](character-counts.md) provide in-depth explanations of the service functionality and features.
-* [**Tutorials**](tutorial-wpf-translation-csharp.md) are longer guides that show you how to use the service as a component in broader business solutions.
+This documentation contains the following article types:
+* [**Quickstarts**](quickstart-translator.md) are getting-started instructions to guide you through making requests to the service.
+* [**How-to guides**](translator-how-to-signup.md) contain instructions for using the service in more specific or customized ways.
+* [**Tutorials**](tutorial-wpf-translation-csharp.md) are longer guides that show you how to use the service as a component in broader business solutions.
## About Microsoft Translator Translator powers many Microsoft products and services, and is used by thousands of businesses worldwide in their applications and workflows.
-Speech translation, powered by Translator, is also available through the [Azure Speech service](../speech-service/index.yml). It combines functionality from the Translator Speech API and the Custom Speech Service into a unified and fully customizable service. 
+Speech translation, powered by Translator, is also available through the [Azure Speech service](../speech-service/index.yml). It combines functionality from the Translator Speech API and the Custom Speech Service into a unified and fully customizable service.
## Language support
-Translator provides multi-language support for text translation, transliteration, language detection, and dictionaries. See [language support](language-support.md) for a complete list, or access the list programmatically with the [REST API](./reference/v3-0-languages.md).
+Translator provides multi-language support for text translation, transliteration, language detection, and dictionaries. See [language support](language-support.md) for a complete list, or access the list programmatically with the [REST API](./reference/v3-0-languages.md).
## Microsoft Translator Neural Machine Translation
NMT provides better translations than SMT not only from a raw translation qualit
NMT models are at the core of the API and are not visible to end users. The only noticeable difference is improved translation quality, especially for languages such as Chinese, Japanese, and Arabic.
-Learn more about [how NMT works](https://www.microsoft.com/en-us/translator/mt.aspx#nnt).
+Learn more about [how NMT works](https://www.microsoft.com/translator/mt.aspx#nnt).
## Improve translations with Custom Translator
With Custom Translator, you can build translation systems to handle the terminol
## Next steps -- [Create a Translator service](./translator-how-to-signup.md) to get your access keys and endpoint.-- Try our [Quickstart](quickstart-translator.md) to quickly call the Translator service.-- [API reference](./reference/v3-0-reference.md) provides the technical documentation for the APIs.-- [Pricing details](https://azure.microsoft.com/pricing/details/cognitive-services/translator-text-api/)
+* [Create a Translator service](./translator-how-to-signup.md) to get your access keys and endpoint.
+* Try our [Quickstart](quickstart-translator.md) to quickly call the Translator service.
+* [API reference](./reference/v3-0-reference.md) provides the technical documentation for the APIs.
+* [Pricing details](https://azure.microsoft.com/pricing/details/cognitive-services/translator-text-api/)
cognitive-services Whats New https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Translator/whats-new.md
+
+ Title: What's new in Translator?
+
+description: Learn of the latest changes to the Translator Service API.
+++++ Last updated : 05/18/2021+++
+<!-- markdownlint-disable MD024 -->
+<!-- markdownlint-disable MD036 -->
+# What's new in Translator
+
+Review the latest updates to the text Translator service. Bookmark this page to stay up to date with release notes, feature enhancements, and documentation updates.
+
+## February 2021
+
+### [Document Translation public preview](https://www.microsoft.com/translator/blog/2021/02/17/introducing-document-translation/)
+
+* **New release**: [Document Translation](document-translation/overview.md) is available as a preview feature of the Translator Service. Preview features are still in development and aren't meant for production use. They're made available on a "preview" basis so customers can get early access and provide feedback. Document Translation enables you to translate large documents and process batch files while still preserving the original structure and format. _See_ [Microsoft Translator blog: Introducing Document Translation](https://www.microsoft.com/translator/blog/2021/02/17/introducing-document-translation/)
+
+### [Text translation support for 9 added languages](https://www.microsoft.com/translator/blog/2021/02/22/microsoft-translator-releases-nine-new-languages-for-international-mother-language-day-2021/)
+
+* Translator service has [text translation language support](language-support.md#text-translation) for the following languages:
+
+   * **Albanian**. A language that forms its own branch of the Indo-European family, spoken by nearly 8 million people.
+ * **Amharic**. An official language of Ethiopia spoken by approximately 32 million people. It's also the liturgical language of the Ethiopian Orthodox church.
+ * **Armenian**. The official language of Armenia with 5-7 million speakers.
+ * **Azerbaijani**. A Turkic language spoken by approximately 23 million people.
+ * **Khmer**. The official language of Cambodia with approximately 16 million speakers.
+ * **Lao**. The official language of Laos with 30 million native speakers.
+ * **Myanmar**. The official language of Myanmar, spoken as a first language by approximately 33 million people.
+ * **Nepali**. The official language of Nepal with approximately 16 million native speakers.
+ * **Tigrinya**. A language spoken in Eritrea and northern Ethiopia with nearly 11 million speakers.
+
+## January 2021
+
+### [Text translation support for Inuktitut](https://www.microsoft.com/translator/blog/2021/01/27/inuktitut-is-now-available-in-microsoft-translator/)
+
+* Translator service has [text translation language support](language-support.md#text-translation) for **Inuktitut**, one of the principal Inuit languages of Canada. Inuktitut is one of eight official aboriginal languages in the Northwest Territories.
+
+## November 2020
+
+### [Custom Translator V2 is generally available](https://www.microsoft.com/translator/blog/2021/01/27/inuktitut-is-now-available-in-microsoft-translator/)
+
+* **New release**: Custom Translator V2 is now generally available (GA). The V2 platform enables you to build custom models with all document types (training, testing, tuning, phrase dictionary, and sentence dictionary). _See_ [Microsoft Translator blog: Custom Translator pushes the translation quality bar closer to human parity](https://www.microsoft.com/translator/blog/2020/11/12/microsoft-custom-translator-pushes-the-translation-quality-bar-closer-to-human-parity).
+
+## October 2020
+
+### [Text translation support for Canadian French](https://www.microsoft.com/translator/blog/2020/10/20/cest-tiguidou-ca-translator-adds-canadian-french/)
+
+* Translator service has [text translation language support](language-support.md#text-translation) for **Canadian French**. Canadian French and European French are similar to one another and are mutually understandable. However, there can be significant differences in vocabulary, grammar, writing, and pronunciation. Over 7 million Canadians (20 percent of the population) speak French as their first language.
+
+## September 2020
+
+### [Text translation support for Assamese and Axomiya](https://www.microsoft.com/translator/blog/2020/09/29/assamese-text-translation-is-here/)
+
+ * Translator service has [text translation language support](language-support.md#text-translation) for **Assamese**, also known as **Axomiya**. Assamese / Axomiya is primarily spoken in Eastern India by approximately 14 million people.
+
+## August 2020
+
+### [Introducing virtual networks and private links for translator](https://www.microsoft.com/translator/blog/2020/08/19/virtual-networks-and-private-links-for-translator-are-now-generally-available/)
+
+* **New release**: Virtual network capabilities and Azure private links for Translator are generally available (GA). Azure private links allow you to access Translator and your Azure hosted services over a private endpoint in your virtual network. You can use private endpoints for Translator to allow clients on a virtual network to securely access data over a private link. _See_ [Microsoft Translator blog: Virtual Networks and Private Links for Translator are generally available](https://www.microsoft.com/translator/blog/2020/08/19/virtual-networks-and-private-links-for-translator-are-now-generally-available/)
+
+### [Custom Translator upgrade to v2](https://www.microsoft.com/translator/blog/2020/08/05/custom-translator-v2-is-now-available/)
+
+* **New release**: Custom Translator V2 phase 1 is available. The newest version of Custom Translator will roll out in two phases to provide quicker translation and quality improvements, and allow you to keep your training data in the region of your choice. *See* [Microsoft Translator blog: Custom Translator: Introducing higher quality translations and regional data residency](https://www.microsoft.com/translator/blog/2020/08/05/custom-translator-v2-is-now-available/)
+
+### [Text translation support for two Kurdish dialects](https://www.microsoft.com/translator/blog/2020/08/20/translator-adds-two-kurdish-dialects-for-text-translation/)
+
+* **Northern (Kurmanji) Kurdish** (15 million native speakers) and **Central (Sorani) Kurdish** (7 million native speakers). Most Kurdish texts are written in Kurmanji and Sorani.
+
+### [Text translation support for two Afghan languages](https://www.microsoft.com/translator/blog/2020/08/17/translator-adds-dari-and-pashto-text-translation/)
+
+* **Dari** (20 million native speakers) and **Pashto** (40 - 60 million speakers). The two official languages of Afghanistan.
+
+### [Text translation support for Odia](https://www.microsoft.com/translator/blog/2020/08/13/odia-language-text-translation-is-now-available-in-microsoft-translator/)
+
+* **Odia** is a classical language spoken by 35 million people in India and across the world. It joins **Bangla**, **Gujarati**, **Hindi**, **Kannada**, **Malayalam**, **Marathi**, **Punjabi**, **Tamil**, **Telugu**, **Urdu**, and **English** as the twelfth most used language of India supported by Microsoft Translator.
cognitive-services Cognitive Services Security https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/cognitive-services-security.md
[Customer Lockbox for Microsoft Azure](../security/fundamentals/customer-lockbox-overview.md) provides an interface for customers to review, and approve or reject customer data access requests. It is used in cases where a Microsoft engineer needs to access customer data during a support request. For information on how Customer Lockbox requests are initiated, tracked, and stored for later reviews and audits, see [Customer Lockbox](../security/fundamentals/customer-lockbox-overview.md).
-Customer Lockbox is available for this Cognitive Service:
+Customer Lockbox is available for this service:
* Translator
cognitive-services Label Tool https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/form-recognizer/quickstarts/label-tool.md
In v2.1, if your training document does not have a value filled in, you can draw
Next, you'll create tags (labels) and apply them to the text elements that you want the model to analyze.
-### [v2.0](#tab/v2-1)
+### [v2.1 preview](#tab/v2-1)
1. First, use the tags editor pane to create the tags you'd like to identify.
1. Select **+** to create a new tag.
cognitive-services Whats New https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/form-recognizer/whats-new.md
Title: What's new in Form Recognizer?
-description: Understand the latest changes to the Form Recognizer API.
+description: Learn the latest changes and updates to the Form Recognizer Service API.
- Previously updated : 04/28/2021+ Last updated : 04/14/2021 <!-- markdownlint-disable MD024 --> <!-- markdownlint-disable MD036 -->
-# What's new in Form Recognizer?
+# What's new in Form Recognizer
-The Form Recognizer service is updated on an ongoing basis. Use this article to stay up to date with feature enhancements, fixes, and documentation updates.
+Learn what's new in the Form Recognizer service. Bookmark this page to stay up-to-date with release notes, feature enhancements, and documentation updates.
## April 2021 <!-- markdownlint-disable MD029 -->
cognitive-services Tutorial Use Personalizer Chat Bot https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/personalizer/tutorial-use-personalizer-chat-bot.md
description: Customize a C# .NET chat bot with a Personalizer loop to provide th
Previously updated : 07/17/2020 Last updated : 05/17/2021
git clone https://github.com/Azure-Samples/cognitive-services-personalizer-sampl
To use this chat bot, you need to create Azure resources for Personalizer and Language Understanding (LUIS).
-* [Create LUIS resources](../luis/luis-how-to-azure-subscription.md#create-luis-resources-in-the-azure-portal). Select **both** in the creation step because you need both authoring and prediction resources.
+* [Create LUIS resources](../luis/luis-how-to-azure-subscription.md). Create both an authoring and prediction resource.
* [Create Personalizer resource](how-to-create-resource.md) then copy the key and endpoint from the Azure portal. You will need to set these values in the `appsettings.json` file of the .NET project. ### Create LUIS app
cognitive-services Text Offsets https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/text-analytics/concepts/text-offsets.md
Previously updated : 03/09/2020 Last updated : 05/18/2021
Because of the different lengths of possible multilingual and emoji encodings, t
## Offsets in the API response
-Whenever offsets are returned the API response, such as [Named Entity Recognition](../how-tos/text-analytics-how-to-entity-linking.md) or [Sentiment Analysis](../how-tos/text-analytics-how-to-sentiment-analysis.md), remember the following:
+Whenever offsets are returned in the API response, such as from [Named Entity Recognition](../how-tos/text-analytics-how-to-entity-linking.md) or [Sentiment Analysis](../how-tos/text-analytics-how-to-sentiment-analysis.md), remember:
* Elements in the response may be specific to the endpoint that was called.
* HTTP POST/GET payloads are encoded in [UTF-8](https://www.w3schools.com/charsets/ref_html_utf8.asp), which may or may not be the default character encoding on your client-side compiler or operating system.
The Text Analytics API returns these textual elements as well, for convenience.
## Offsets in API version 3.1-preview
-Beginning with API version 3.1-preview.1, all Text Analytics API endpoints that return an offset will support the `stringIndexType` parameter. This parameter adjusts the `offset` and `length` attributes in the API output to match the requested string iteration scheme. Currently, we support three types:
+In version 3.1 of the API, all Text Analytics API endpoints that return an offset will support the `stringIndexType` parameter. This parameter adjusts the `offset` and `length` attributes in the API output to match the requested string iteration scheme. Currently, we support three types:
1. `textElement_v8` (default): iterates over graphemes as defined by the [Unicode 8.0.0](https://unicode.org/versions/Unicode8.0.0) standard
2. `unicodeCodePoint`: iterates over [Unicode Code Points](http://www.unicode.org/versions/Unicode13.0.0/ch02.pdf#G25564), the default scheme for Python 3
-3. `utf16CodeUnit`: iterates over [UTF-16 Code Units](https://unicode.org/faq/utf_bom.html#UTF16), the default scheme for Javascript, Java, and .NET
+3. `utf16CodeUnit`: iterates over [UTF-16 Code Units](https://unicode.org/faq/utf_bom.html#UTF16), the default scheme for JavaScript, Java, and .NET
If the `stringIndexType` requested matches the programming environment of choice, substring extraction can be done using standard substring or slice methods.
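For illustration, here is a small Python sketch (the sample text is my own, not from the docs) showing how the same string yields different lengths under the `unicodeCodePoint` and `utf16CodeUnit` schemes, since the emoji occupies one code point but two UTF-16 code units:

```python
text = "Hello 👍"

# unicodeCodePoint scheme: Python 3 strings iterate over Unicode code points.
code_points = len(text)  # 7

# utf16CodeUnit scheme (JavaScript, Java, .NET): the emoji is a surrogate pair,
# so it counts as two 16-bit code units.
utf16_units = len(text.encode("utf-16-le")) // 2  # 8

print(code_points, utf16_units)
```

An `offset`/`length` pair returned under one scheme will only slice correctly in an environment that counts the same way, which is why the parameter exists.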
cognitive-services Text Analytics For Health https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/text-analytics/how-tos/text-analytics-for-health.md
Previously updated : 03/11/2021 Last updated : 05/12/2021 - # How to: Use Text Analytics for health (preview) > [!IMPORTANT]
-> Text Analytics for health is a preview capability provided "AS IS" and "WITH ALL FAULTS." As such, **Text Analytics for health (preview) should not be implemented or deployed in any production use.**
-Text Analytics for health is not intended or made available for use as a medical device, clinical support, diagnostic tool, or other technology intended to be used in the diagnosis, cure, mitigation, treatment, or prevention of disease or other conditions, and no license or right is granted by Microsoft to use this capability for such purposes. This capability is not designed or intended to be implemented or deployed as a substitute for professional medical advice or healthcare opinion, diagnosis, treatment, or the clinical judgment of a healthcare professional, and should not be used as such. The customer is solely responsible for any use of Text Analytics for health. Microsoft does not warrant that Text Analytics for health or any materials provided in connection with the capability will be sufficient for any medical purposes or otherwise meet the health or medical requirements of any person.
+> Text Analytics for health is a preview capability provided "AS IS" and "WITH ALL FAULTS." As such, Text Analytics for health (preview) should not be implemented or deployed in any production use. Text Analytics for health is not intended or made available for use as a medical device, clinical support, diagnostic tool, or other technology intended to be used in the diagnosis, cure, mitigation, treatment, or prevention of disease or other conditions, and no license or right is granted by Microsoft to use this capability for such purposes. This capability is not designed or intended to be implemented or deployed as a substitute for professional medical advice or healthcare opinion, diagnosis, treatment, or the clinical judgment of a healthcare professional, and should not be used as such. The customer is solely responsible for any use of Text Analytics for health. The Customer must separately license any and all source vocabularies it intends to use under the terms set forth in the [UMLS Metathesaurus License Agreement Appendix](https://www.nlm.nih.gov/research/umls/knowledge_sources/metathesaurus/release/license_agreement_appendix.html) or any future equivalent link. The Customer is responsible for ensuring compliance with those license terms, including any geographic or other applicable restrictions.
Text Analytics for health is a feature of the Text Analytics API service that extracts and labels relevant medical information from unstructured texts such as doctor's notes, discharge summaries, clinical documents, and electronic health records. There are two ways to utilize this service:
The meaning of medical content is highly affected by modifiers, such as negative
See the [entity categories](../named-entity-types.md?tabs=health) returned by Text Analytics for health for a full list of supported entities. For information on confidence scores, see the [Text Analytics transparency note](/legal/cognitive-services/text-analytics/transparency-note#general-guidelines-to-understand-and-improve-performance?context=/azure/cognitive-services/text-analytics/context/context).
-### Supported languages and regions
+### Supported languages
Text Analytics for health only supports English language documents.
-The Text Analytics for health hosted web API is currently only available in these regions: West US 2, East US 2, Central US, North Europe and West Europe.
- ## Request access to the public preview Fill out and submit the [Cognitive Services request form](https://aka.ms/csgate) to request access to the Text Analytics for health public preview. You will not be billed for Text Analytics for health usage.
Document size must be under 5,120 characters per document. For the maximum numbe
### Structure the API request for the hosted asynchronous web API
-For both the container and hosted web API, you must create a POST request. You can [use Postman](text-analytics-how-to-call-api.md), a cURL command or the **API testing console** in the [Text Analytics for health hosted API reference](https://westus2.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-preview-3/operations/Health) to quickly construct and send a POST request to the hosted web API in your desired region.
-
-> [!NOTE]
-> Both the asynchronous `/analyze` and `/health` endpoints are only available in the following regions: West US 2, East US 2, Central US, North Europe and West Europe. To make successful requests to these endpoints, please make sure your resource is created in one of these regions.
+For both the container and hosted web API, you must create a POST request. You can [use Postman](text-analytics-how-to-call-api.md), a cURL command, or the **API testing console** in the [Text Analytics for health hosted API reference](https://westus2.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-preview-5/operations/Health) to quickly construct and send a POST request to the hosted web API in your desired region. In the API v3.1-preview.5 endpoint, the `loggingOptOut` boolean query parameter can be used to enable logging for troubleshooting purposes. Its default is `true` if not specified in the request query.
Below is an example of a JSON file attached to the Text Analytics for health API request's POST body:
example.json
Since this POST request is used to submit a job for the asynchronous operation, there is no text in the response object. However, you need the value of the operation-location KEY in the response headers to make a GET request to check the status of the job and the output. Below is an example of the value of the operation-location KEY in the response header of the POST request:
-`https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics/v3.1-preview.4/entities/health/jobs/<jobID>`
+`https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics/v3.1-preview.5/entities/health/jobs/<jobID>`
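As a non-authoritative sketch of the submit step (the subdomain, key, and function names below are placeholders of my own), capturing that `operation-location` header might look like this in Python:

```python
import json
import urllib.request

API_VERSION = "v3.1-preview.5"

def health_jobs_url(subdomain: str) -> str:
    # Hosted asynchronous endpoint for Text Analytics for health jobs.
    return (f"https://{subdomain}.cognitiveservices.azure.com"
            f"/text/analytics/{API_VERSION}/entities/health/jobs")

def submit_health_job(subdomain: str, key: str, documents: list) -> str:
    """POST the documents; return the operation-location header used for polling."""
    request = urllib.request.Request(
        health_jobs_url(subdomain),
        data=json.dumps({"documents": documents}).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Ocp-Apim-Subscription-Key": key,
        },
        method="POST",
    )
    with urllib.request.urlopen(request) as response:  # expect HTTP 202
        return response.headers["operation-location"]
```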
To check the job status, make a GET request to the URL in the value of the operation-location KEY header of the POST response. The following states are used to reflect the status of a job: `NotStarted`, `running`, `succeeded`, `failed`, `rejected`, `cancelling`, and `cancelled`.
-You can cancel a job with a `NotStarted` or `running` status with a DELETE HTTP call to the same URL as the GET request. More information on the DELETE call is available in the [Text Analytics for health hosted API reference](https://westus2.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-preview-3/operations/CancelHealthJob).
+You can cancel a job with a `NotStarted` or `running` status with a DELETE HTTP call to the same URL as the GET request. More information on the DELETE call is available in the [Text Analytics for health hosted API reference](https://westus2.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-preview-5/operations/CancelHealthJob).
The following is an example of the response of a GET request. The output is available for retrieval until the `expirationDateTime` (24 hours from the time the job was created) has passed after which the output is purged.
The following is an example of the response of a GET request. The output is ava
You can [use Postman](text-analytics-how-to-call-api.md) or the example cURL request below to submit a query to the container you deployed, replacing the `serverURL` variable with the appropriate value. Note the version of the API in the URL for the container is different than the hosted API. ```bash
-curl -X POST 'http://<serverURL>:5000/text/analytics/v3.2-preview.1/entities/health' --header 'Content-Type: application/json' --header 'accept: application/json' --data-binary @example.json
+curl -X POST 'http://<serverURL>:5000/text/analytics/v3.1-preview.5/entities/health' --header 'Content-Type: application/json' --header 'accept: application/json' --data-binary @example.json
```
cognitive-services Text Analytics How To Call Api https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/text-analytics/how-tos/text-analytics-how-to-call-api.md
Previously updated : 03/01/2021 Last updated : 05/18/2021 - # How to call the Text Analytics REST API
Before you use the Text Analytics API, you will need to create an Azure resource
1. First, go to the [Azure portal](https://ms.portal.azure.com/#create/Microsoft.CognitiveServicesTextAnalytics) and create a new Text Analytics resource, if you don't have one already. Choose a [pricing tier](https://azure.microsoft.com/pricing/details/cognitive-services/text-analytics/).
-2. Select the region you want to use for your endpoint. Please note the `/analyze` and `/health` endpoints are only available in the following regions: West US 2, East US 2, Central US, North Europe and West Europe.
+2. Select the region you want to use for your endpoint.
3. Create the Text Analytics resource and go to the "keys and endpoint blade" in the left of the page. Copy the key to be used later when you call the APIs. You'll add this later as a value for the `Ocp-Apim-Subscription-Key` header.
Before you use the Text Analytics API, you will need to create an Azure resource
2. Click **Metrics**, located under **Monitoring** in the left navigation menu. 3. Select *Processed text records* in the dropdown box for **Metric**.
-A text record is 1000 characters.
+A text record is a unit of input text of up to 1,000 characters. For example, 1,500 characters submitted as input text will count as two text records.
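That counting rule can be sketched as follows (the helper name is my own; how empty input is billed is not specified here):

```python
import math

def billed_text_records(text: str) -> int:
    # A text record covers up to 1,000 characters of input;
    # any partial record rounds up to a whole one.
    return math.ceil(len(text) / 1000)

print(billed_text_records("a" * 1500))  # 2
```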
## Change your pricing tier
You can call Text Analytics synchronously (for low latency scenarios). You have
## Using the API asynchronously
-Starting in v3.1-preview.3, the Text Analytics API provides two asynchronous endpoints:
+The Text Analytics v3.1-preview.5 API provides two asynchronous endpoints:
* The `/analyze` endpoint for Text Analytics allows you to analyze the same set of text documents with multiple text analytics features in one API call. Previously, to use multiple features you would need to make separate API calls for each operation. Consider this capability when you need to analyze large sets of documents with more than one Text Analytics feature.
* The `/health` endpoint for Text Analytics for health, which can extract and label relevant medical information from clinical documents.
-Please note the /analyze and /health endpoints are only available in the following regions: West US 2, East US 2, Central US, North Europe and West Europe.
See the table below to see which features can be used asynchronously. Note that only a few features can be called from the `/analyze` endpoint.

| Feature | Synchronous | Asynchronous |
|--|--|--|
| Language detection | ✔ | |
-| Sentiment analysis | ✔ | |
-| Opinion mining | ✔ | |
+| Sentiment analysis | ✔ | ✔* |
+| Opinion mining | ✔ | ✔* |
| Key phrase extraction | ✔ | ✔* |
| Named Entity Recognition (including PII and PHI) | ✔ | ✔* |
| Entity linking | ✔ | ✔* |
The `/analyze` endpoint lets you choose which of the supported Text Analytics fe
* Key Phrase Extraction
* Named Entity Recognition (including PII and PHI)
* Entity Linking
+* Sentiment Analysis
+* Opinion Mining
| Element | Valid values | Required? | Usage |
|--|--|--|--|
The `/analyze` endpoint lets you choose which of the supported Text Analytics fe
|`documents` | Includes the `id` and `text` fields below | Required | Contains information for each document being sent, and the raw text of the document. |
|`id` | String | Required | The IDs you provide are used to structure the output. |
|`text` | Unstructured raw text, up to 125,000 characters. | Required | Must be in the English language, which is the only language currently supported. |
-|`tasks` | Includes the following Text Analytics features: `entityRecognitionTasks`,`entityLinkingTasks`,`keyPhraseExtractionTasks` or `entityRecognitionPiiTasks`. | Required | One or more of the Text Analytics features you want to use. Note that `entityRecognitionPiiTasks` has an optional `domain` parameter that can be set to `pii` or `phi` and the `pii-categories` for detection of selected entity types. If the `domain` parameter is unspecified, the system defaults to `pii`. |
+|`tasks` | Includes the following Text Analytics features: `entityRecognitionTasks`,`entityLinkingTasks`,`keyPhraseExtractionTasks`,`entityRecognitionPiiTasks` or `sentimentAnalysisTasks`. | Required | One or more of the Text Analytics features you want to use. Note that `entityRecognitionPiiTasks` has an optional `domain` parameter that can be set to `pii` or `phi` and the `pii-categories` for detection of selected entity types. If the `domain` parameter is unspecified, the system defaults to `pii`. Similarly `sentimentAnalysisTasks` has the `opinionMining` boolean parameter to include Opinion Mining results in the output for Sentiment Analysis. |
|`parameters` | Includes the `model-version` and `stringIndexType` fields below | Required | This field is included within the above feature tasks that you choose. They contain information about the model version that you want to use and the index type. |
|`model-version` | String | Required | Specify which version of the model being called that you want to use. |
|`stringIndexType` | String | Required | Specify the text decoder that matches your programming environment. Types supported are `textElement_v8` (default), `unicodeCodePoint`, `utf16CodeUnit`. Please see the [Text offsets article](../concepts/text-offsets.md#offsets-in-api-version-31-preview) for more information. |
The `/analyze` endpoint lets you choose which of the supported Text Analytics fe
{ "parameters": { "model-version": "latest",
- "stringIndexType": "TextElements_v8"
+ "stringIndexType": "TextElements_v8",
+ "loggingOptOut": "false"
+ }
+ }
+ ],
+ "entityRecognitionPiiTasks": [
+ {
+ "parameters": {
+ "model-version": "latest",
+ "stringIndexType": "TextElements_v8",
+ "loggingOptOut": "true",
+ "domain": "phi",
+ "pii-categories":"default"
} } ],
The `/analyze` endpoint lets you choose which of the supported Text Analytics fe
{ "parameters": { "model-version": "latest",
- "stringIndexType": "TextElements_v8"
+ "stringIndexType": "TextElements_v8",
+ "loggingOptOut": "false"
} } ],
- "keyPhraseExtractionTasks": [{
- "parameters": {
- "model-version": "latest"
+ "keyPhraseExtractionTasks": [
+ {
+ "parameters": {
+ "model-version": "latest",
+ "loggingOptOut": "false"
+ }
}
- }],
- "entityRecognitionPiiTasks": [{
- "parameters": {
- "model-version": "latest",
- "stringIndexType": "TextElements_v8",
- "domain": "phi",
- "pii-categories":"default"
+ ],
+ "sentimentAnalysisTasks": [
+ {
+ "parameters": {
+ "model-version": "latest",
+ "stringIndexType": "TextElements_v8",
+ "loggingOptOut": "false",
+ "opinionMining": "false"
+ }
}
- }]
+ ]
} }
In Postman (or another web API test tool), add the endpoint for the feature you
| Feature | Request type | Resource endpoints |
|--|--|--|
-| Language detection | POST | `<your-text-analytics-resource>/text/analytics/v3.0/languages` |
-| Sentiment analysis | POST | `<your-text-analytics-resource>/text/analytics/v3.0/sentiment` |
-| Opinion Mining | POST | `<your-text-analytics-resource>/text/analytics/v3.0/sentiment?opinionMining=true` |
-| Key phrase extraction | POST | `<your-text-analytics-resource>/text/analytics/v3.0/keyPhrases` |
-| Named entity recognition - general | POST | `<your-text-analytics-resource>/text/analytics/v3.0/entities/recognition/general` |
-| Named entity recognition - PII | POST | `<your-text-analytics-resource>/text/analytics/v3.0/entities/recognition/pii` |
-| Named entity recognition - PHI | POST | `<your-text-analytics-resource>/text/analytics/v3.0/entities/recognition/pii?domain=phi` |
+| Language Detection | POST | `<your-text-analytics-resource>/text/analytics/v3.0/languages` |
+| Sentiment Analysis | POST | `<your-text-analytics-resource>/text/analytics/v3.0/sentiment` |
+| Opinion Mining | POST | `<your-text-analytics-resource>/text/analytics/v3.1-preview.5/sentiment?opinionMining=true` |
+| Key Phrase Extraction | POST | `<your-text-analytics-resource>/text/analytics/v3.0/keyPhrases` |
+| Named Entity Recognition - General | POST | `<your-text-analytics-resource>/text/analytics/v3.0/entities/recognition/general` |
+| Named Entity Recognition - PII | POST | `<your-text-analytics-resource>/text/analytics/v3.1-preview.5/entities/recognition/pii` |
+| Named Entity Recognition - PHI | POST | `<your-text-analytics-resource>/text/analytics/v3.1-preview.5/entities/recognition/pii?domain=phi` |
+| Entity Linking | POST | `<your-text-analytics-resource>/text/analytics/v3.0/entities/linking` |
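As a non-authoritative sketch of a synchronous call (the resource name, key, and helper names below are placeholders of my own, and the request body follows the `documents`/`id`/`language`/`text` convention used throughout this API):

```python
import json
import urllib.request

def build_documents(texts, language="en"):
    # The API expects {"documents": [{"id": ..., "language": ..., "text": ...}]}.
    return [{"id": str(i), "language": language, "text": t}
            for i, t in enumerate(texts, start=1)]

def analyze_sentiment(resource, key, texts):
    """Synchronous POST to the v3.0 sentiment endpoint; returns the parsed JSON."""
    url = f"https://{resource}.cognitiveservices.azure.com/text/analytics/v3.0/sentiment"
    request = urllib.request.Request(
        url,
        data=json.dumps({"documents": build_documents(texts)}).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Ocp-Apim-Subscription-Key": key},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)
```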
#### [Asynchronous](#tab/asynchronous)
In Postman (or another web API test tool), add the endpoint for the feature you
| Feature | Request type | Resource endpoints |
|--|--|--|
-| Submit analysis job | POST | `https://<your-text-analytics-resource>/text/analytics/v3.1-preview.4/analyze` |
-| Get analysis status and results | GET | `https://<your-text-analytics-resource>/text/analytics/v3.1-preview.4/analyze/jobs/<Operation-Location>` |
+| Submit analysis job | POST | `https://<your-text-analytics-resource>/text/analytics/v3.1-preview.5/analyze` |
+| Get analysis status and results | GET | `https://<your-text-analytics-resource>/text/analytics/v3.1-preview.5/analyze/jobs/<Operation-Location>` |
### Endpoints for sending asynchronous requests to the `/health` endpoint

| Feature | Request type | Resource endpoints |
|--|--|--|
-| Submit Text Analytics for health job | POST | `https://<your-text-analytics-resource>/text/analytics/v3.1-preview.4/entities/health/jobs` |
-| Get job status and results | GET | `https://<your-text-analytics-resource>/text/analytics/v3.1-preview.4/entities/health/jobs/<Operation-Location>` |
-| Cancel job | DELETE | `https://<your-text-analytics-resource>/text/analytics/v3.1-preview.4/entities/health/jobs/<Operation-Location>` |
+| Submit Text Analytics for health job | POST | `https://<your-text-analytics-resource>/text/analytics/v3.1-preview.5/entities/health/jobs` |
+| Get job status and results | GET | `https://<your-text-analytics-resource>/text/analytics/v3.1-preview.5/entities/health/jobs/<Operation-Location>` |
+| Cancel job | DELETE | `https://<your-text-analytics-resource>/text/analytics/v3.1-preview.5/entities/health/jobs/<Operation-Location>` |
Submit the API request. If you made the call to a synchronous endpoint, the resp
If you made the call to the asynchronous `/analyze` or `/health` endpoints, check that you received a 202 response code. You will need to get the response to view the results:

1. In the API response, find the `Operation-Location` from the header, which identifies the job you sent to the API.
-2. Create a GET request for the endpoint you used. refer to the [table above](#set-up-a-request) for the endpoint format, and review the [API reference documentation](https://westus2.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-preview-3/operations/AnalyzeStatus). For example:
+2. Create a GET request for the endpoint you used. Refer to the [table above](#set-up-a-request) for the endpoint format, and review the [API reference documentation](https://westus2.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-preview-5/operations/AnalyzeStatus). For example:
- `https://my-resource.cognitiveservices.azure.com/text/analytics/v3.1-preview.4/analyze/jobs/<Operation-Location>`
+ `https://my-resource.cognitiveservices.azure.com/text/analytics/v3.1-preview.5/analyze/jobs/<Operation-Location>`
3. Add the `Operation-Location` to the request.
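The steps above can be sketched in Python as follows (the helper names are my own; the status values are the ones listed earlier for job polling):

```python
import json
import urllib.request

# Job status values include NotStarted, running, succeeded, failed,
# rejected, cancelling, and cancelled; these four end the job.
TERMINAL_STATES = {"succeeded", "failed", "rejected", "cancelled"}

def is_done(job: dict) -> bool:
    # True once the job has reached a terminal state.
    return job.get("status", "").lower() in TERMINAL_STATES

def get_job_result(operation_location: str, key: str) -> dict:
    # GET the URL taken from the Operation-Location response header.
    request = urllib.request.Request(
        operation_location,
        headers={"Ocp-Apim-Subscription-Key": key},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)
```

A caller would poll `get_job_result` until `is_done` returns true, then read the task output from the response body.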
The synchronous endpoint responses will vary depending on the endpoint you use.
If successful, the GET request to the `/analyze` endpoint will return an object containing the assigned tasks. For example `keyPhraseExtractionTasks`. These tasks contain the response object from the appropriate Text Analytics feature. See the following articles for more information.

+ [Key phrase extraction](text-analytics-how-to-keyword-extraction.md#step-3-view-results)
++ [Sentiment analysis](text-analytics-how-to-sentiment-analysis.md#view-the-results)
+ [Entity recognition](text-analytics-how-to-entity-linking.md#view-results)
+ [Text Analytics for health](text-analytics-for-health.md#hosted-asynchronous-web-api-response)
cognitive-services Text Analytics How To Entity Linking https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/text-analytics/how-tos/text-analytics-how-to-entity-linking.md
Previously updated : 03/15/2021 Last updated : 04/14/2021
The PII feature is part of NER and it can identify and redact sensitive entities
## Named Entity Recognition features and versions
-| Feature | NER v3.0 | NER v3.1-preview.4 |
+| Feature | NER v3.0 | NER v3.1-preview.5 |
|--|--|--|
| Methods for single and batch requests | X | X |
| Expanded entity recognition across several categories | X | X |
See [language support](../language-support.md) for information.
Named Entity Recognition v3 provides expanded detection across multiple types. Currently, NER v3.0 can recognize entities in the [general entity category](../named-entity-types.md).
-Named Entity Recognition v3.1-preview.4 includes the detection capabilities of v3.0, and:
-* The ability to detect personal information (`PII`) using the `v3.1-preview.4/entities/recognition/pii` endpoint.
+Named Entity Recognition v3.1-preview.5 includes the detection capabilities of v3.0, and:
+* The ability to detect personal information (`PII`) using the `v3.1-preview.5/entities/recognition/pii` endpoint.
* An optional `domain=phi` parameter to detect confidential health information (`PHI`).
* [Asynchronous operation](text-analytics-how-to-call-api.md) using the `/analyze` endpoint.
Create a POST request. You can [use Postman](text-analytics-how-to-call-api.md)
#### [Version 3.1-preview](#tab/version-3-preview)
-Named Entity Recognition `v3.1-preview.4` uses separate endpoints for NER, PII, and entity linking requests. Use a URL format below based on your request.
+Named Entity Recognition `v3.1-preview.5` uses separate endpoints for NER, PII, and entity linking requests. Use a URL format below based on your request.
**Entity linking**
-* `https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics/v3.1-preview.4/entities/linking`
+* `https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics/v3.1-preview.5/entities/linking`
-[Named Entity Recognition version 3.1-preview reference for `Linking`](https://westus2.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-Preview-4/operations/EntitiesLinking)
+[Named Entity Recognition version 3.1-preview reference for `Linking`](https://westus2.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-Preview-5/operations/EntitiesLinking)
**Named Entity Recognition**
-* General entities - `https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics/v3.1-preview.4/entities/recognition/general`
+* General entities - `https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics/v3.1-preview.5/entities/recognition/general`
-[Named Entity Recognition version 3.1-preview reference for `General`](https://westus2.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-Preview-4/operations/EntitiesRecognitionGeneral)
+[Named Entity Recognition version 3.1-preview reference for `General`](https://westus2.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-Preview-5/operations/EntitiesRecognitionGeneral)
**Personally Identifiable Information (PII)**
-* Personal (`PII`) information - `https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics/v3.1-preview.4/entities/recognition/pii`
+* Personal (`PII`) information - `https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics/v3.1-preview.5/entities/recognition/pii`
You can also use the optional `domain=phi` parameter to detect health (`PHI`) information in text.
-`https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics/v3.1-preview.4/entities/recognition/pii?domain=phi`
+`https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics/v3.1-preview.5/entities/recognition/pii?domain=phi`
-Starting in `v3.1-preview.4`, The JSON response includes a `redactedText` property, which contains the modified input text where the detected PII entities are replaced by an `*` for each character in the entities.
+Starting in `v3.1-preview.5`, the JSON response includes a `redactedText` property, which contains the modified input text where the detected PII entities are replaced by an `*` for each character in the entities.
-[Named Entity Recognition version 3.1-preview reference for `PII`](https://westus2.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-Preview-4/operations/EntitiesRecognitionPii)
+[Named Entity Recognition version 3.1-preview reference for `PII`](https://westus2.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-Preview-5/operations/EntitiesRecognitionPii)
The API will attempt to detect the [listed entity categories](../named-entity-types.md?tabs=personal) for a given document language. If you want to specify which entities will be detected and returned, use the optional `piiCategories` parameter with the appropriate entity categories. This parameter can also let you detect entities that aren't enabled by default for your document language. For example, a French driver's license number might occur in English text.
-`https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics/v3.1-preview.4/entities/recognition/pii?piiCategories=[FRDriversLicenseNumber]`
+`https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics/v3.1-preview.5/entities/recognition/pii?piiCategories=[FRDriversLicenseNumber]`
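The endpoint variants above differ only in their query parameters, so URL construction can be sketched like this. A minimal sketch, assuming a comma-separated `piiCategories` value for illustration (the article shows a bracketed list form):

```python
from urllib.parse import urlencode

def pii_endpoint(subdomain, domain=None, pii_categories=None):
    """Build the v3.1-preview.5 PII recognition URL with the optional
    domain and piiCategories query parameters named in this article."""
    base = (f"https://{subdomain}.cognitiveservices.azure.com"
            "/text/analytics/v3.1-preview.5/entities/recognition/pii")
    params = {}
    if domain:
        params["domain"] = domain          # e.g. "phi" for health information
    if pii_categories:
        params["piiCategories"] = ",".join(pii_categories)
    return base + ("?" + urlencode(params) if params else "")

print(pii_endpoint("my-resource", domain="phi"))
```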
**Asynchronous operation**
-Starting in `v3.1-preview.4`, You can send NER and entity linking requests asynchronously using the `/analyze` endpoint.
+Starting in `v3.1-preview.5`, you can send NER and entity linking requests asynchronously using the `/analyze` endpoint.
-* Asynchronous operation - `https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics/v3.1-preview.4/analyze`
+* Asynchronous operation - `https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics/v3.1-preview.5/analyze`
See [How to call the Text Analytics API](text-analytics-how-to-call-api.md) for information on sending asynchronous requests.
cognitive-services Text Analytics How To Keyword Extraction https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/text-analytics/how-tos/text-analytics-how-to-keyword-extraction.md
All POST requests return a JSON formatted response with the IDs and detected pro
Output is returned immediately. You can stream the results to an application that accepts JSON or save the output to a file on the local system, and then import it into an application that allows you to sort, search, and manipulate the data.
-An example of the output for key phrase extraction from the v3.1-preview.2 endpoint is shown here:
+An example of the output for key phrase extraction from the v3.1-preview endpoint is shown here:
### Synchronous result
cognitive-services Text Analytics How To Sentiment Analysis https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/text-analytics/how-tos/text-analytics-how-to-sentiment-analysis.md
Previously updated : 03/29/2021 Last updated : 04/14/2021
Create a POST request. You can [use Postman](text-analytics-how-to-call-api.md)
#### [Version 3.1-preview](#tab/version-3-1)
-[Sentiment Analysis v3.1 reference](https://westcentralus.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-preview-3/operations/Sentiment)
+[Sentiment Analysis v3.1 reference](https://westcentralus.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-preview-5/operations/Sentiment)
#### [Version 3.0](#tab/version-3)
Set the HTTPS endpoint for sentiment analysis by using either a Text Analytics r
**Sentiment Analysis**
-`https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics/v3.1-preview.4/sentiment`
+`https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics/v3.1-preview.5/sentiment`
**Opinion Mining** To get Opinion Mining results, you must include the `opinionMining=true` parameter. For example:
-`https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics/v3.1-preview.4/sentiment?opinionMining=true`
+`https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics/v3.1-preview.5/sentiment?opinionMining=true`
This parameter is set to `false` by default.
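A request against the sentiment endpoint with opinion mining enabled can be sketched as below. The subdomain and key are hypothetical placeholders; `Ocp-Apim-Subscription-Key` is the standard Cognitive Services authentication header:

```python
import json

# Hypothetical values -- substitute your own resource name and key.
SUBDOMAIN = "my-resource"
KEY = "<your-key>"

# opinionMining=true requests aspect-level opinions alongside sentiment.
url = (f"https://{SUBDOMAIN}.cognitiveservices.azure.com"
       "/text/analytics/v3.1-preview.5/sentiment?opinionMining=true")
headers = {
    "Ocp-Apim-Subscription-Key": KEY,
    "Content-Type": "application/json",
}
body = {"documents": [{"id": "1", "language": "en",
                       "text": "The rooms were great, but the staff was unfriendly."}]}

print(url)
print(json.dumps(body))
```

Send `body` as the POST payload with any HTTP client (for example, Postman as described above).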
cognitive-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/text-analytics/language-support.md
Previously updated : 02/23/2021 Last updated : 05/18/2021 # Text Analytics API v3 language support
If you have content expressed in a less frequently used language, you can try La
|Tongan|`to`|✓|2020-09-01| |Turkish|`tr`|✓|2021-01-05| |Turkmen|`tk`|✓|2021-01-05|
+|Ukrainian|`uk`|✓||
+|Urdu|`ur`|✓||
+|Uzbek|`uz`|✓||
+|Vietnamese|`vi`|✓||
+|Welsh|`cy`|✓||
|Xhosa|`xh`|✓|2021-01-05|
+|Yiddish|`yi`|✓||
|Yoruba|`yo`|✓|2021-01-05|
+|Yucatec Maya|`yua`|✓||
|Zulu|`zu`|✓|2021-01-05| + ## See also
cognitive-services Migration Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/text-analytics/migration-guide.md
Previously updated : 01/22/2021 Last updated : 04/14/2021
If your application uses the REST API, update its request endpoint to the v3 end
See the reference documentation for examples of the JSON response. * [Version 2.1](https://westcentralus.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v2-1/operations/56f30ceeeda5650db055a3c9) * [Version 3.0](https://westus.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-0/operations/Sentiment)
-* [Version 3.1-preview](https://westcentralus.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-preview-3/operations/Sentiment)
+* [Version 3.1-preview](https://westcentralus.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-preview-5/operations/Sentiment)
#### Client libraries
You will also need to update your application to use the [entity categories](nam
See the reference documentation for examples of the JSON response. * [Version 2.1](https://westcentralus.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v2-1/operations/5ac4251d5b4ccd1554da7634) * [Version 3.0](https://westus.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-0/operations/EntitiesRecognitionGeneral)
-* [Version 3.1-preview](https://westcentralus.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-preview-3/operations/EntitiesRecognitionGeneral)
+* [Version 3.1-preview](https://westcentralus.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-preview-5/operations/EntitiesRecognitionGeneral)
#### Client libraries
If your application uses the REST API, update its request endpoint to the v3 end
See the reference documentation for examples of the JSON response. * [Version 2.1](https://westcentralus.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v2-1/operations/56f30ceeeda5650db055a3c7) * [Version 3.0](https://westus.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-0/operations/Languages)
-* [Version 3.1](https://westcentralus.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-preview-3/operations/Languages)
+* [Version 3.1](https://westcentralus.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-preview-5/operations/Languages)
#### Client libraries
If your application uses the REST API, update its request endpoint to the v3 end
See the reference documentation for examples of the JSON response. * [Version 2.1](https://westcentralus.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v2-1/operations/56f30ceeeda5650db055a3c6) * [Version 3.0](https://westus.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-0/operations/KeyPhrases)
-* [Version 3.1](https://westcentralus.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-preview-1/operations/KeyPhrases)
+* [Version 3.1](https://westcentralus.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-preview-5/operations/KeyPhrases)
#### Client libraries
cognitive-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/text-analytics/overview.md
Previously updated : 03/29/2021 Last updated : 04/14/2021 keywords: text mining, sentiment analysis, text analytics
The Text Analytics API is a cloud-based service that provides Natural Language Processing (NLP) features for text mining and text analysis, including: sentiment analysis, opinion mining, key phrase extraction, language detection, and named entity recognition.
-The API is a part of [Azure Cognitive Services](../index.yml), a collection of machine learning and AI algorithms in the cloud for your development projects. You can use these features with the REST API [version 3.0](https://westus.dev.cognitive.microsoft.com/docs/services/TextAnalytics-V3-0/) or [version 3.1-preview](https://westus2.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-preview-3/), or the [client library](quickstarts/client-libraries-rest-api.md).
+The API is a part of [Azure Cognitive Services](../index.yml), a collection of machine learning and AI algorithms in the cloud for your development projects. You can use these features with the REST API [version 3.0](https://westus.dev.cognitive.microsoft.com/docs/services/TextAnalytics-V3-0/) or [version 3.1-preview](https://westus2.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-preview-5/), or the [client library](quickstarts/client-libraries-rest-api.md).
> [!VIDEO https://channel9.msdn.com/Shows/AI-Show/Whats-New-in-Text-Analytics-Opinion-Mining-and-Async-API/player]
cognitive-services Whats New https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/text-analytics/whats-new.md
Previously updated : 03/25/2021 Last updated : 05/17/2021
The Text Analytics API is updated on an ongoing basis. To stay up-to-date with recent developments, this article provides you with information about new releases and features.
+## May 2021
+
+### General API updates
+
+* Release of the new API v3.1-preview.5, which includes:
+ * Asynchronous [Analyze API](how-tos/text-analytics-how-to-call-api.md?tabs=asynchronous) now supports Sentiment Analysis (SA) and Opinion Mining (OM).
+ * A new query parameter, `LoggingOptOut`, is now available for customers who wish to opt out of logging input text for incident reports. Learn more about this parameter in the [data privacy](/legal/cognitive-services/text-analytics/data-privacy?context=/azure/cognitive-services/text-analytics/context/context) article.
+* Text Analytics for health and the Analyze asynchronous operations are now available in all regions.
+ ## March 2021 ### General API updates
communication-services Troubleshooting Info https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/troubleshooting-info.md
The Azure Communication Services Calling SDK uses the following error codes to h
| 500, 503, 504 | Communication Services infrastructure error. | File a support request through the Azure portal. | | 603 | Call globally declined by remote Communication Services participant | Expected behavior. |
+## Chat SDK error codes
+
+The Azure Communication Services Chat SDK uses the following error codes to help you troubleshoot chat issues. The error codes are exposed through the `error.code` property in the error response.
+
+| Error code | Description | Action to take |
+| -- | | |
+| 401 | Unauthorized | Ensure that your Communication Services token is valid and not expired. |
+| 403 | Forbidden | Ensure that the initiator of the request has access to the resource. |
+| 429 | Too many requests | Ensure that your client-side application handles this scenario in a user-friendly manner. If the error persists, please file a support request. |
+| 503 | Service Unavailable | File a support request through the Azure portal. |
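The 429 row above can be handled with a simple retry loop. A minimal sketch, where `send_message` and `ChatError` are hypothetical stand-ins for a real SDK call that surfaces the `error.code` property:

```python
import time

class ChatError(Exception):
    """Hypothetical error type exposing a numeric .code like the table above."""
    def __init__(self, code):
        super().__init__(f"chat error {code}")
        self.code = code

def send_with_retry(send_message, retries=3, backoff=1.0):
    """Retry a throttled (429) call with exponential backoff; re-raise others."""
    for attempt in range(retries):
        try:
            return send_message()
        except ChatError as err:
            if err.code == 429 and attempt < retries - 1:
                time.sleep(backoff * (2 ** attempt))  # back off on throttling
                continue
            raise  # 401/403/503 etc. need a different remedy, per the table
```

Errors such as 401 (expired token) or 403 (missing access) are not retried here, since retrying cannot fix them.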
+ ## Related information - [Logs and diagnostics](logging-and-diagnostics.md) - [Metrics](metrics.md)
connectors Apis List https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/connectors/apis-list.md
An *action* is an operation that follows the trigger and performs some kind of t
## Connector categories
-In Logic Apps, most triggers and actions are available in either a *built-in* version or *managed connector* version. A small number of triggers and actions are available in both versions. The versions available depend on whether you create a multi-tenant logic app or a single-tenant logic app, which is currently available only in [single-tenant Azure Logic Apps](../logic-apps/single-tenant-overview-compare.md).
+In Logic Apps, most triggers and actions are available in either a *built-in* version or *managed connector* version. A small number of triggers and actions are available in both versions. The versions available depend on whether you create a multi-tenant logic app or a single-tenant logic app, which is currently available only in [Logic Apps Preview](../logic-apps/single-tenant-overview-compare.md).
[Built-in triggers and actions](built-in.md) run natively on the Logic Apps runtime, don't require creating connections, and perform these kinds of tasks:
connectors Connectors Create Api Mq https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/connectors/connectors-create-api-mq.md
Previously updated : 05/25/2021 Last updated : 04/26/2021 tags: connectors
This connector includes a Microsoft MQ client that communicates with a remote MQ
## Available operations
-* Multi-tenant Azure Logic Apps: When you create a **Logic App (Consumption)** resource, you can connect to an MQ server only by using the *managed* MQ connector. This connector provides only actions, no triggers.
+The IBM MQ connector provides actions but no triggers.
-* Single-tenant Azure Logic Apps: When you create a single-tenant based logic app workflow, you can connect to an MQ server by using either the managed MQ connector, which includes *only* actions, or the *built-in* MQ operations, which includes triggers *and* actions.
+* Multi-tenant Azure Logic Apps: When you create a consumption-based logic app workflow, you can connect to an MQ server by using the *managed* MQ connector.
+
+* Single-tenant Azure Logic Apps (preview): When you create a preview logic app workflow, you can connect to an MQ server by using either the managed MQ connector or the *built-in* MQ operations (preview).
For more information about the difference between a managed connector and built-in operations, review [key terms in Logic Apps](../logic-apps/logic-apps-overview.md#logic-app-concepts).
The following list describes only some of the managed operations available for M
For all the managed connector operations and other technical information, such as properties, limits, and so on, review the [MQ connector's reference page](/connectors/mq/).
-#### [Built-in](#tab/built-in)
+#### [Built-in (preview)](#tab/built-in)
The following list describes only some of the built-in operations available for MQ:
-* When a message is available in a queue, take some action.
-* When one or more messages are received from a queue (auto-complete), take some action.
-* When one or more messages are received from a queue (peek-lock), take some action.
-* Receive a single message or an array of messages from a queue. For multiple messages, you can specify the maximum number of messages to return per batch and the maximum batch size in KB.
+* Receive a single message or an array of messages from the MQ server. For multiple messages, you can specify the maximum number of messages to return per batch and the maximum batch size in KB.
* Send a single message or an array of messages to the MQ server. These built-in MQ operations also have the following capabilities plus the benefits from all the other capabilities for logic apps in the [single-tenant Logic Apps service](../logic-apps/single-tenant-overview-compare.md):
When you add an MQ action for the first time, you're prompted to create a connec
1. When you're done, select **Create**.
-#### [Built-in](#tab/built-in)
+#### [Built-in (preview)](#tab/built-in)
1. Provide the connection information for your MQ server.
container-instances Container Instances Faq https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-instances/container-instances-faq.md
See more [detailed guidance](container-instances-troubleshooting.md#container-ta
### What Windows base OS images are supported? > [!NOTE]
-> Due to issues with backwards compatibility after the Windows updates in 2020, the following image versions include the minimum version number that we recommend you use in your base image. Current deployments using older image versions are not impacted, but new deployments should adhere to the following base images.
+> Due to issues with backward compatibility after the Windows updates in 2020, the following image versions include the minimum version number that we recommend you use in your base image. Current deployments using older image versions are not impacted, but new deployments should adhere to the following base images. After June 14th, 2021, ACI will no longer support deployments using older version numbers.
#### Windows Server 2016 base images
-* [Nano Server](https://hub.docker.com/_/microsoft-windows-nanoserver): `sac2016`, `10.0.14393.3506` or newer
-* [Windows Server Core](https://hub.docker.com/_/microsoft-windows-servercore): `ltsc2016`, `10.0.14393.3506` or newer
+* [Nano Server](https://hub.docker.com/_/microsoft-windows-nanoserver): `sac2016`, `10.0.14393.3568` or newer
+* [Windows Server Core](https://hub.docker.com/_/microsoft-windows-servercore): `ltsc2016`, `10.0.14393.3568` or newer
> [!NOTE] > Windows images based on Semi-Annual Channel release 1709 or 1803 are not supported.
container-registry Container Registry Auto Purge https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-registry/container-registry-auto-purge.md
Title: Purge tags and manifests description: Use a purge command to delete multiple tags and manifests from an Azure container registry based on age and a tag filter, and optionally schedule purge operations. Previously updated : 02/19/2021 Last updated : 05/07/2021 # Automatically purge images from an Azure container registry When you use an Azure container registry as part of a development workflow, the registry can quickly fill up with images or other artifacts that aren't needed after a short period. You might want to delete all tags that are older than a certain duration or match a specified name filter. To delete multiple artifacts quickly, this article introduces the `acr purge` command you can run as an on-demand or [scheduled](container-registry-tasks-scheduled.md) ACR Task.
-The `acr purge` command is currently distributed in a public container image (`mcr.microsoft.com/acr/acr-cli:0.4`), built from source code in the [acr-cli](https://github.com/Azure/acr-cli) repo in GitHub.
+The `acr purge` command is currently distributed in a public container image (`mcr.microsoft.com/acr/acr-cli:0.4`), built from source code in the [acr-cli](https://github.com/Azure/acr-cli) repo in GitHub. `acr purge` is currently in preview.
You can use the Azure Cloud Shell or a local installation of the Azure CLI to run the ACR task examples in this article. If you'd like to use it locally, version 2.0.76 or later is required. Run `az --version` to find the version. If you need to install or upgrade, see [Install Azure CLI][azure-cli-install].
-> [!IMPORTANT]
-> This feature is currently in preview. Previews are made available to you on the condition that you agree to the [supplemental terms of use][terms-of-use]. Some aspects of this feature may change prior to general availability (GA).
- > [!WARNING] > Use the `acr purge` command with caution--deleted image data is UNRECOVERABLE. If you have systems that pull images by manifest digest (as opposed to image name), you should not purge untagged images. Deleting untagged images will prevent those systems from pulling the images from your registry. Instead of pulling by manifest, consider adopting a *unique tagging* scheme, a [recommended best practice](container-registry-image-tag-version.md).
container-registry Container Registry Delete https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-registry/container-registry-delete.md
Title: Delete image resources description: Details on how to effectively manage registry size by deleting container image data using Azure CLI commands. Previously updated : 07/31/2019 Last updated : 05/07/2021
-# Delete container images in Azure Container Registry using the Azure CLI
+# Delete container images in Azure Container Registry
To maintain the size of your Azure container registry, you should periodically delete stale image data. While some container images deployed into production may require longer-term storage, others can typically be deleted more quickly. For example, in an automated build and test scenario, your registry can quickly fill with images that might never be deployed, and can be purged shortly after completing the build and test pass.
Because you can delete image data in several different ways, it's important to u
* Delete by [tag](#delete-by-tag): Deletes an image, the tag, all unique layers referenced by the image, and all other tags associated with the image. * Delete by [manifest digest](#delete-by-manifest-digest): Deletes an image, all unique layers referenced by the image, and all tags associated with the image.
-Sample scripts are provided to help automate delete operations.
- For an introduction to these concepts, see [About registries, repositories, and images](container-registry-concepts.md).
+> [!NOTE]
+> After you delete image data, Azure Container Registry stops billing you immediately for the associated storage. However, the registry recovers the associated storage space using an asynchronous process. It takes some time before the registry cleans up layers and shows the updated storage usage.
+ ## Delete repository Deleting a repository deletes all of the images in the repository, including all tags, unique layers, and manifests. When you delete a repository, you recover the storage space used by the images that reference unique layers in that repository.
The `acr-helloworld:v2` image is deleted from the registry, as is any layer data
To maintain the size of a repository or registry, you might need to periodically delete manifest digests older than a certain date.
-The following Azure CLI command lists all manifest digest in a repository older than a specified timestamp, in ascending order. Replace `<acrName>` and `<repositoryName>` with values appropriate for your environment. The timestamp could be a full date-time expression or a date, as in this example.
+The following Azure CLI command lists all manifest digests in a repository older than a specified timestamp, in ascending order. Replace `<acrName>` and `<repositoryName>` with values appropriate for your environment. The timestamp could be a full date-time expression or a date, as in this example.
```azurecli az acr repository show-manifests --name <acrName> --repository <repositoryName> \
As mentioned in the [Manifest digest](container-registry-concepts.md#manifest-di
As you can see in the output of the last step in the sequence, there is now an orphaned manifest whose `"tags"` property is an empty list. This manifest still exists within the registry, along with any unique layer data that it references. **To delete such orphaned images and their layer data, you must delete by manifest digest**.
-## Delete all untagged images
-
-You can list all untagged images in your repository using the following Azure CLI command. Replace `<acrName>` and `<repositoryName>` with values appropriate for your environment.
-
-```azurecli
-az acr repository show-manifests --name <acrName> --repository <repositoryName> --query "[?tags[0]==null].digest"
-```
-
-Using this command in a script, you can delete all untagged images in a repository.
-
-> [!WARNING]
-> Use the following sample scripts with caution--deleted image data is UNRECOVERABLE. If you have systems that pull images by manifest digest (as opposed to image name), you should not run these scripts. Deleting untagged images will prevent those systems from pulling the images from your registry. Instead of pulling by manifest, consider adopting a *unique tagging* scheme, a [recommended best practice](container-registry-image-tag-version.md).
-
-**Azure CLI in Bash**
-
-The following Bash script deletes all untagged images from a repository. It requires the Azure CLI and **xargs**. By default, the script performs no deletion. Change the `ENABLE_DELETE` value to `true` to enable image deletion.
-
-```bash
-#!/bin/bash
-
-# WARNING! This script deletes data!
-# Run only if you do not have systems
-# that pull images via manifest digest.
-
-# Change to 'true' to enable image delete
-ENABLE_DELETE=false
-
-# Modify for your environment
-REGISTRY=myregistry
-REPOSITORY=myrepository
+## Automatically purge tags and manifests
-# Delete all untagged (orphaned) images
-if [ "$ENABLE_DELETE" = true ]
-then
- az acr repository show-manifests --name $REGISTRY --repository $REPOSITORY --query "[?tags[0]==null].digest" -o tsv \
- | xargs -I% az acr repository delete --name $REGISTRY --image $REPOSITORY@% --yes
-else
- echo "No data deleted."
- echo "Set ENABLE_DELETE=true to enable image deletion of these images in $REPOSITORY:"
- az acr repository show-manifests --name $REGISTRY --repository $REPOSITORY --query "[?tags[0]==null]" -o tsv
-fi
-```
-
-**Azure CLI in PowerShell**
-
-The following PowerShell script deletes all untagged images from a repository. It requires PowerShell and the Azure CLI. By default, the script performs no deletion. Change the `$enableDelete` value to `$TRUE` to enable image deletion.
-
-```powershell
-# WARNING! This script deletes data!
-# Run only if you do not have systems
-# that pull images via manifest digest.
-
-# Change to '$TRUE' to enable image delete
-$enableDelete = $FALSE
-
-# Modify for your environment
-$registry = "myregistry"
-$repository = "myrepository"
-
-if ($enableDelete) {
- az acr repository show-manifests --name $registry --repository $repository --query "[?tags[0]==null].digest" -o tsv `
- | %{ az acr repository delete --name $registry --image $repository@$_ --yes }
-} else {
- Write-Host "No data deleted."
- Write-Host "Set `$enableDelete = `$TRUE to enable image deletion."
- az acr repository show-manifests --name $registry --repository $repository --query "[?tags[0]==null]" -o tsv
-}
-```
+Azure Container Registry provides the following automated methods to remove tags and manifests, and their associated unique layer data:
+* Create an ACR task that runs the `acr purge` container command to delete all tags that are older than a certain duration or match a specified name filter. Optionally configure `acr purge` to delete untagged manifests.
-## Automatically purge tags and manifests (preview)
+ The `acr purge` container command is currently in preview. For more information, see [Automatically purge images from an Azure container registry](container-registry-auto-purge.md).
-As an alternative to scripting Azure CLI commands, run an on-demand or scheduled ACR task to delete all tags that are older than a certain duration or match a specified name filter. For more information, see [Automatically purge images from an Azure container registry](container-registry-auto-purge.md).
+* Optionally set a [retention policy](container-registry-retention-policy.md) for each registry, to manage untagged manifests. When you enable a retention policy, image manifests in the registry that don't have any associated tags, and the underlying layer data, are automatically deleted after a set period.
-Optionally set a [retention policy](container-registry-retention-policy.md) for each registry, to manage untagged manifests. When you enable a retention policy, image manifests in the registry that don't have any associated tags, and the underlying layer data, are automatically deleted after a set period.
+ The retention policy is currently a preview feature of **Premium** container registries. The retention policy only applies to untagged manifests created after the policy takes effect.
## Next steps
-For more information about image storage in Azure Container Registry see [Container image storage in Azure Container Registry](container-registry-storage.md).
+For more information about image storage in Azure Container Registry, see [Container image storage in Azure Container Registry](container-registry-storage.md).
<!-- IMAGES --> [manifest-digest]: ./media/container-registry-delete/01-manifest-digest.png
container-registry Container Registry Retention Policy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-registry/container-registry-retention-policy.md
Title: Policy to retain untagged manifests
-description: Learn how to enable a retention policy in your Azure container registry, for automatic deletion of untagged manifests after a defined period.
+description: Learn how to enable a retention policy in your Premium Azure container registry, for automatic deletion of untagged manifests after a defined period.
Previously updated : 10/02/2019 Last updated : 04/26/2021 # Set a retention policy for untagged manifests
-Azure Container Registry gives you the option to set a *retention policy* for stored image manifests that don't have any associated tags (*untagged manifests*). When a retention policy is enabled, untagged manifests in the registry are automatically deleted after a number of days you set. This feature prevents the registry from filling up with artifacts that aren't needed and helps you save on storage costs. If the `delete-enabled` attribute of an untagged manifest is set to `false`, the manifest can't be deleted, and the retention policy doesn't apply.
+Azure Container Registry gives you the option to set a *retention policy* for stored image manifests that don't have any associated tags (*untagged manifests*). When a retention policy is enabled, untagged manifests in the registry are automatically deleted after a number of days you set. This feature prevents the registry from filling up with artifacts that aren't needed and helps you save on storage costs.
You can use the Azure Cloud Shell or a local installation of the Azure CLI to run the command examples in this article. If you'd like to use the CLI locally, version 2.0.74 or later is required. Run `az --version` to find the version. If you need to install or upgrade, see [Install Azure CLI][azure-cli].
-A retention policy is a feature of **Premium** container registries. For information about registry service tiers, see [Azure Container Registry service tiers](container-registry-skus.md).
-
-> [!IMPORTANT]
-> This feature is currently in preview, and some [limitations apply](#preview-limitations). Previews are made available to you on the condition that you agree to the [supplemental terms of use][terms-of-use]. Some aspects of this feature may change prior to general availability (GA).
+A retention policy for untagged manifests is currently a preview feature of **Premium** container registries. For information about registry service tiers, see [Azure Container Registry service tiers](container-registry-skus.md).
> [!WARNING] > Set a retention policy with care--deleted image data is UNRECOVERABLE. If you have systems that pull images by manifest digest (as opposed to image name), you should not set a retention policy for untagged manifests. Deleting untagged images will prevent those systems from pulling the images from your registry. Instead of pulling by manifest, consider adopting a *unique tagging* scheme, a [recommended best practice](container-registry-image-tag-version.md).
-## Preview limitations
-
-* You can only set a retention policy for untagged manifests.
-* The retention policy currently applies only to manifests that are untagged *after* the policy is enabled. Existing untagged manifests in the registry aren't subject to the policy. To delete existing untagged manifests, see examples in [Delete container images in Azure Container Registry](container-registry-delete.md).
- ## About the retention policy Azure Container Registry does reference counting for manifests in the registry. When a manifest is untagged, it checks the retention policy. If a retention policy is enabled, a manifest delete operation is queued, with a specific date, according to the number of days set in the policy. A separate queue management job constantly processes messages, scaling as needed. As an example, suppose you untagged two manifests, 1 hour apart, in a registry with a retention policy of 30 days. Two messages would be queued. Then, 30 days later, approximately 1 hour apart, the messages would be retrieved from the queue and processed, assuming the policy was still in effect.
+If the `delete-enabled` attribute of an untagged manifest is set to `false`, the manifest is locked and is not deleted by the policy.
+
+> [!IMPORTANT]
+> The retention policy applies only to untagged manifests with timestamps *after* the policy is enabled. Untagged manifests in the registry with earlier timestamps aren't subject to the policy. For other options to delete image data, see examples in [Delete container images in Azure Container Registry](container-registry-delete.md).
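The lock described above can also be applied from the CLI. This is a sketch, not part of the original walkthrough; the registry name `myregistry` and the `hello-world:latest` image are placeholders:

```azurecli
# Lock the manifest: delete-enabled=false excludes it from retention-policy deletion
az acr repository update \
  --name myregistry --image hello-world:latest \
  --delete-enabled false

# Later, unlock it so the policy can apply again
az acr repository update \
  --name myregistry --image hello-world:latest \
  --delete-enabled true
```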
+ ## Set a retention policy - CLI The following example shows you how to use the Azure CLI to set a retention policy for untagged manifests in a registry.
az acr config retention update --registry myregistry --status enabled --days 30
The following example sets a policy to delete any manifest in the registry as soon as it's untagged. Create this policy by setting a retention period of 0 days. ```azurecli
-az acr config retention update --registry myregistry --status enabled --days 0 --type UntaggedManifests
+az acr config retention update \
+ --registry myregistry --status enabled \
+ --days 0 --type UntaggedManifests
``` ### Validate a retention policy
If you enable the preceding policy with a retention period of 0 days, you can qu
1. Push a test image `hello-world:latest` image to your registry, or substitute another test image of your choice. 1. Untag the `hello-world:latest` image, for example, using the [az acr repository untag][az-acr-repository-untag] command. The untagged manifest remains in the registry. ```azurecli
- az acr repository untag --name myregistry --image hello-world:latest
+ az acr repository untag \
+ --name myregistry --image hello-world:latest
``` 1. Within a few seconds, the untagged manifest is deleted. You can verify the deletion by listing manifests in the repository, for example, using the [az acr repository show-manifests][az-acr-repository-show-manifests] command. If the test image was the only one in the repository, the repository itself is deleted.
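The validation steps above can be sketched end to end. This assumes the registry `myregistry`, a local Docker installation, and an authenticated `az` session; the test image source is a placeholder you can substitute:

```azurecli
# Push a test image to the registry
az acr login --name myregistry
docker pull mcr.microsoft.com/hello-world
docker tag mcr.microsoft.com/hello-world myregistry.azurecr.io/hello-world:latest
docker push myregistry.azurecr.io/hello-world:latest

# Untag it; with a 0-day policy the manifest is queued for deletion
az acr repository untag \
  --name myregistry --image hello-world:latest

# After a few seconds, the untagged manifest should no longer be listed
az acr repository show-manifests \
  --name myregistry --repository hello-world
```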
-### Disable a retention policy
+### Manage a retention policy
-To see the retention policy set in a registry, run the [az acr config retention show][az-acr-config-retention-show] command:
+To show the retention policy set in a registry, run the [az acr config retention show][az-acr-config-retention-show] command:
```azurecli az acr config retention show --registry myregistry
az acr config retention show --registry myregistry
To disable a retention policy in a registry, run the [az acr config retention update][az-acr-config-retention-update] command and set `--status disabled`: ```azurecli
-az acr config retention update --registry myregistry --status disabled --type UntaggedManifests
+az acr config retention update \
+ --registry myregistry --status disabled \
+ --type UntaggedManifests
``` ## Set a retention policy - portal
-You can also set a registry's retention policy in the [Azure portal](https://portal.azure.com). The following example shows you how to use the portal to set a retention policy for untagged manifests in a registry.
+You can also set a registry's retention policy in the [Azure portal](https://portal.azure.com).
### Enable a retention policy
cosmos-db How To Create Container https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/how-to-create-container.md
This article explains the different ways to create a container in Azure Cosmos D
1. Open the **Data Explorer** pane, and select **New Container**. Next, provide the following details: * Indicate whether you are creating a new database or using an existing one.
- * Enter a container ID.
- * Enter a partition key.
- * Enter a throughput to be provisioned (for example, 1000 RUs).
+ * Enter a **Container Id**.
+ * Enter a **Partition key** value (for example, `/ItemID`).
+ * Select **Autoscale** or **Manual** throughput and enter the required **Container throughput** (for example, 1000 RU/s).
* Select **OK**.
- :::image type="content" source="./media/how-to-create-container/partitioned-collection-create-sql.png" alt-text="Screenshot of Data Explorer pane, with New Container highlighted":::
+ :::image type="content" source="./media/how-to-provision-container-throughput/provision-container-throughput-portal-sql-api.png" alt-text="Screenshot of Data Explorer, with New Collection highlighted":::
## <a id="cli-sql"></a>Create a container using Azure CLI
cosmos-db How To Provision Autoscale Throughput https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/how-to-provision-autoscale-throughput.md
If you are using a different API, see [API for MongoDB](how-to-provision-through
1. Navigate to your Azure Cosmos DB account and open the **Data Explorer** tab.
-1. Select **New Container.** Enter a name for your database, container, and a partition key. Under **Throughput**, select the **autoscale** option, and set the [maximum throughput (RU/s)](provision-throughput-autoscale.md#how-autoscale-provisioned-throughput-works) that you want the database or container to scale to.
+1. Select **New Container.** Enter a name for your database, container, and a partition key. Under database or container throughput, select the **Autoscale** option, and set the [maximum throughput (RU/s)](provision-throughput-autoscale.md#how-autoscale-provisioned-throughput-works) that you want the database or container to scale to.
:::image type="content" source="./media/how-to-provision-autoscale-throughput/create-new-autoscale-container.png" alt-text="Creating a container and configuring autoscale provisioned throughput":::
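For comparison, the same autoscale setting can be applied when creating a container from the CLI. This sketch assumes an existing account `mycosmosaccount`, resource group `myResourceGroup`, and database `mydb`, all placeholders:

```azurecli
# Create a container with autoscale throughput that scales up to 4000 RU/s
az cosmosdb sql container create \
  --account-name mycosmosaccount \
  --resource-group myResourceGroup \
  --database-name mydb \
  --name mycontainer \
  --partition-key-path "/ItemID" \
  --max-throughput 4000
```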
cosmos-db How To Provision Container Throughput https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/how-to-provision-container-throughput.md
If you are using a different API, see [API for MongoDB](how-to-provision-through
1. Open the **Data Explorer** pane, and select **New Container**. Next, provide the following details: * Indicate whether you are creating a new database or using an existing one.
- * Enter a Container ID.
- * Enter a partition key value (for example, `/ItemID`).
- * Enter a throughput that you want to provision (for example, 1000 RUs).
+ * Enter a **Container Id**.
+ * Enter a **Partition key** value (for example, `/ItemID`).
+ * Select **Autoscale** or **Manual** throughput and enter the required **Container throughput** (for example, 1000 RU/s).
* Select **OK**. :::image type="content" source="./media/how-to-provision-container-throughput/provision-container-throughput-portal-sql-api.png" alt-text="Screenshot of Data Explorer, with New Collection highlighted":::
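The portal steps above roughly correspond to the following CLI call; the account, resource group, and database names are placeholders:

```azurecli
# Create a container with 1000 RU/s of manual (standard) throughput
az cosmosdb sql container create \
  --account-name mycosmosaccount \
  --resource-group myResourceGroup \
  --database-name mydb \
  --name mycontainer \
  --partition-key-path "/ItemID" \
  --throughput 1000
```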
cosmos-db How To Provision Database Throughput https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/how-to-provision-database-throughput.md
If you are using a different API, see [API for MongoDB](how-to-provision-through
1. Open the **Data Explorer** pane, and select **New Database**. Provide the following details: * Enter a database ID.
- * Select the **Provision database throughput** option.
- * Enter a throughput (for example, 1000 RUs).
+ * Select the **Share throughput across containers** option.
+ * Select **Autoscale** or **Manual** throughput and enter the required **Database throughput** (for example, 1000 RU/s).
+ * Enter a name for your container under **Container ID**.
+ * Enter a **Partition key**.
* Select **OK**. :::image type="content" source="./media/how-to-provision-database-throughput/provision-database-throughput-portal-sql-api.png" alt-text="Screenshot of New Database dialog box":::
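As a sketch of the same configuration from the CLI (account and resource names are placeholders), throughput is provisioned on the database and shared by containers created without dedicated throughput:

```azurecli
# Provision 1000 RU/s of shared throughput at the database level
az cosmosdb sql database create \
  --account-name mycosmosaccount \
  --resource-group myResourceGroup \
  --name mydb \
  --throughput 1000

# A container created without its own throughput shares the database's RU/s
az cosmosdb sql container create \
  --account-name mycosmosaccount \
  --resource-group myResourceGroup \
  --database-name mydb \
  --name mycontainer \
  --partition-key-path "/ItemID"
```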
cosmos-db How To Use Stored Procedures Triggers Udfs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/how-to-use-stored-procedures-triggers-udfs.md
const newItem = [{
}]; const container = client.database("myDatabase").container("myContainer"); const sprocId = "spCreateToDoItems";
-const {body: result} = await container.scripts.storedProcedure(sprocId).execute(newItem, {partitionKey: newItem[0].category});
+const {resource: result} = await container.scripts.storedProcedure(sprocId).execute(newItem, {partitionKey: newItem[0].category});
``` ### Stored procedures - Python SDK
data-factory Connector Azure Cosmos Db https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-cosmos-db.md
Previously updated : 03/17/2021 Last updated : 05/18/2021 # Copy and transform data in Azure Cosmos DB (SQL API) by using Azure Data Factory
This Azure Cosmos DB (SQL API) connector is supported for the following activiti
For Copy activity, this Azure Cosmos DB (SQL API) connector supports: -- Copy data from and to the Azure Cosmos DB [SQL API](../cosmos-db/introduction.md).
+- Copy data from and to the Azure Cosmos DB [SQL API](../cosmos-db/introduction.md) using key, service principal, or managed identities for Azure resources authentications.
- Write to Azure Cosmos DB as **insert** or **upsert**. - Import and export JSON documents as-is, or copy data from or to a tabular dataset. Examples include a SQL database and a CSV file. To copy documents as-is to or from JSON files or to or from another Azure Cosmos DB collection, see [Import and export JSON documents](#import-and-export-json-documents).
The following sections provide details about properties you can use to define Da
## Linked service properties
-The following properties are supported for the Azure Cosmos DB (SQL API) linked service:
+The Azure Cosmos DB (SQL API) connector supports the following authentication types. See the corresponding sections for details:
+
+- [Key authentication](#key-authentication)
+- [Service principal authentication (Preview)](#service-principal-authentication)
+- [Managed identities for Azure resources authentication (Preview)](#managed-identity)
+
+### Key authentication
| Property | Description | Required | |: |: |: |
The following properties are supported for the Azure Cosmos DB (SQL API) linked
} ```
+### <a name="service-principal-authentication"></a> Service principal authentication (Preview)
+
+>[!NOTE]
+>Currently, the service principal authentication is not supported in data flow.
+
+To use service principal authentication, follow these steps.
+
+1. Register an application entity in Azure Active Directory (Azure AD) by following the steps in [Register your application with an Azure AD tenant](../storage/common/storage-auth-aad-app.md#register-your-application-with-an-azure-ad-tenant). Make note of the following values, which you use to define the linked service:
+
+ - Application ID
+ - Application key
+ - Tenant ID
+
+2. Grant the service principal proper permission. For examples of how permissions work in Cosmos DB, see [Configure role-based access control for your Azure Cosmos DB account](../cosmos-db/how-to-setup-rbac.md). More specifically, create a role definition, and assign the role to the service principal via the service principal's object ID.
+
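As a hedged sketch of step 2 (account and resource-group names are placeholders to replace with your own; `00000000-0000-0000-0000-000000000002` is the documented ID of the Cosmos DB built-in data contributor role), the role assignment can be made with the Azure CLI:

```azurecli
# Assign the built-in data contributor role to the service principal's object ID
az cosmosdb sql role assignment create \
  --account-name mycosmosaccount \
  --resource-group myResourceGroup \
  --role-definition-id "00000000-0000-0000-0000-000000000002" \
  --principal-id "<service-principal-object-id>" \
  --scope "/"
```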
+These properties are supported for the linked service:
+
+| Property | Description | Required |
+|: |: |: |
+| type | The type property must be set to **CosmosDb**. |Yes |
+| accountEndpoint | Specify the account endpoint URL for the Azure Cosmos DB. | Yes |
+| database | Specify the name of the database. | Yes |
+| servicePrincipalId | Specify the application's client ID. | Yes |
+| servicePrincipalCredentialType | The credential type to use for service principal authentication. Allowed values are **ServicePrincipalKey** and **ServicePrincipalCert**. | Yes |
+| servicePrincipalCredential | The service principal credential. <br/> When you use **ServicePrincipalKey** as the credential type, specify the application's key. Mark this field as **SecureString** to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). <br/> When you use **ServicePrincipalCert** as the credential, reference a certificate in Azure Key Vault. | Yes |
+| servicePrincipalKey | Specify the application's key. Mark this field as **SecureString** to store it securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | No |
+| tenant | Specify the tenant information (domain name or tenant ID) under which your application resides. Retrieve it by hovering the mouse in the upper-right corner of the Azure portal. | Yes |
+| azureCloudType | For service principal authentication, specify the type of Azure cloud environment to which your Azure Active Directory application is registered. <br/> Allowed values are **AzurePublic**, **AzureChina**, **AzureUsGovernment**, and **AzureGermany**. By default, the data factory's cloud environment is used. | No |
+| connectVia | The [integration runtime](concepts-integration-runtime.md) to be used to connect to the data store. You can use the Azure integration runtime or a self-hosted integration runtime if your data store is in a private network. If not specified, the default Azure integration runtime is used. |No |
+
+**Example: using service principal key authentication**
+
+You can also store the service principal key in Azure Key Vault.
+
+```json
+{
+ "name": "CosmosDbSQLAPILinkedService",
+ "properties": {
+ "type": "CosmosDb",
+ "typeProperties": {
+ "accountEndpoint": "<account endpoint>",
+ "database": "<database name>",
+ "servicePrincipalId": "<service principal id>",
+ "servicePrincipalCredentialType": "ServicePrincipalKey",
+ "servicePrincipalCredential": {
+ "type": "SecureString",
+ "value": "<service principal key>"
+ },
+ "tenant": "<tenant info, e.g. microsoft.onmicrosoft.com>"
+ },
+ "connectVia": {
+ "referenceName": "<name of Integration Runtime>",
+ "type": "IntegrationRuntimeReference"
+ }
+ }
+}
+```
+
+**Example: using service principal certificate authentication**
+```json
+{
+ "name": "CosmosDbSQLAPILinkedService",
+ "properties": {
+ "type": "CosmosDb",
+ "typeProperties": {
+ "accountEndpoint": "<account endpoint>",
+ "database": "<database name>",
+ "servicePrincipalId": "<service principal id>",
+ "servicePrincipalCredentialType": "ServicePrincipalCert",
+ "servicePrincipalCredential": {
+ "type": "AzureKeyVaultSecret",
+ "store": {
+ "referenceName": "<AKV reference>",
+ "type": "LinkedServiceReference"
+ },
+ "secretName": "<certificate name in AKV>"
+ },
+ "tenant": "<tenant info, e.g. microsoft.onmicrosoft.com>"
+ },
+ "connectVia": {
+ "referenceName": "<name of Integration Runtime>",
+ "type": "IntegrationRuntimeReference"
+ }
+ }
+}
+```
+
+### <a name="managed-identity"></a> Managed identities for Azure resources authentication (Preview)
+
+>[!NOTE]
+>Currently, the managed identity authentication is not supported in data flow.
+
+A data factory can be associated with a [managed identity for Azure resources](data-factory-service-identity.md), which represents this specific data factory. You can directly use this managed identity for Cosmos DB authentication, similar to using your own service principal. It allows this designated factory to access and copy data to or from your Cosmos DB.
+
+To use managed identities for Azure resource authentication, follow these steps.
+
+1. [Retrieve the Data Factory managed identity information](data-factory-service-identity.md#retrieve-managed-identity) by copying the value of the **managed identity object ID** generated along with your factory.
+
+2. Grant the managed identity proper permission. For examples of how permissions work in Cosmos DB, see [Configure role-based access control for your Azure Cosmos DB account](../cosmos-db/how-to-setup-rbac.md). More specifically, create a role definition, and assign the role to the managed identity.
+
+These properties are supported for the linked service:
+
+| Property | Description | Required |
+|: |: |: |
+| type | The type property must be set to **CosmosDb**. |Yes |
+| accountEndpoint | Specify the account endpoint URL for the Azure Cosmos DB. | Yes |
+| database | Specify the name of the database. | Yes |
+| connectVia | The [integration runtime](concepts-integration-runtime.md) to be used to connect to the data store. You can use the Azure integration runtime or a self-hosted integration runtime if your data store is in a private network. If not specified, the default Azure integration runtime is used. |No |
+
+**Example:**
+
+```json
+{
+ "name": "CosmosDbSQLAPILinkedService",
+ "properties": {
+ "type": "CosmosDb",
+ "typeProperties": {
+ "accountEndpoint": "<account endpoint>",
+ "database": "<database name>"
+ },
+ "connectVia": {
+ "referenceName": "<name of Integration Runtime>",
+ "type": "IntegrationRuntimeReference"
+ }
+ }
+}
+```
+ ## Dataset properties For a full list of sections and properties that are available for defining datasets, see [Datasets and linked services](concepts-datasets-linked-services.md).
data-factory Connector Dynamics Crm Office 365 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-dynamics-crm-office-365.md
Title: Copy data in Dynamics (Common Data Service)
-description: Learn how to copy data from Microsoft Dynamics CRM or Microsoft Dynamics 365 (Common Data Service/Microsoft Dataverse) to supported sink data stores or from supported source data stores to Dynamics CRM or Dynamics 365 by using a copy activity in a data factory pipeline.
+ Title: Copy data in Dynamics (Microsoft Dataverse)
+description: Learn how to copy data from Microsoft Dynamics CRM or Microsoft Dynamics 365 (Microsoft Dataverse) to supported sink data stores or from supported source data stores to Dynamics CRM or Dynamics 365 by using a copy activity in a data factory pipeline.
Last updated 03/17/2021 -
-# Copy data from and to Dynamics 365 (Common Data Service/Microsoft Dataverse) or Dynamics CRM by using Azure Data Factory
+# Copy data from and to Dynamics 365 (Microsoft Dataverse) or Dynamics CRM by using Azure Data Factory
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]- This article outlines how to use a copy activity in Azure Data Factory to copy data from and to Microsoft Dynamics 365 and Microsoft Dynamics CRM. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of a copy activity. ## Supported capabilities
This connector is supported for the following activities:
- [Copy activity](copy-activity-overview.md) with [supported source and sink matrix](copy-activity-overview.md) - [Lookup activity](control-flow-lookup-activity.md)
-You can copy data from Dynamics 365 (Common Data Service/Microsoft Dataverse) or Dynamics CRM to any supported sink data store. You also can copy data from any supported source data store to Dynamics 365 (Common Data Service) or Dynamics CRM. For a list of data stores that a copy activity supports as sources and sinks, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
+You can copy data from Dynamics 365 (Microsoft Dataverse) or Dynamics CRM to any supported sink data store. You also can copy data from any supported source data store to Dynamics 365 (Microsoft Dataverse) or Dynamics CRM. For a list of data stores that a copy activity supports as sources and sinks, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
-This Dynamics connector supports Dynamics versions 7 through 9 for both online and on-premises. More specifically:
+>[!NOTE]
+>Effective November 2020, Common Data Service has been renamed to [Microsoft Dataverse](/powerapps/maker/data-platform/data-platform-intro). This article is updated to reflect the latest terminology.
+This Dynamics connector supports Dynamics versions 7 through 9 for both online and on-premises. More specifically:
- Version 7 maps to Dynamics CRM 2015. - Version 8 maps to Dynamics CRM 2016 and the early version of Dynamics 365. - Version 9 maps to the later version of Dynamics 365. + Refer to the following table of supported authentication types and configurations for Dynamics versions and products. | Dynamics versions | Authentication types | Linked service samples | |: |: |: |
-| Common Data Service <br/><br/> Dynamics 365 online <br/><br/> Dynamics CRM online | Azure Active Directory (Azure AD) service principal <br/><br/> Office 365 | [Dynamics online and Azure AD service-principal or Office 365 authentication](#dynamics-365-and-dynamics-crm-online) |
+| Dataverse <br/><br/> Dynamics 365 online <br/><br/> Dynamics CRM online | Azure Active Directory (Azure AD) service principal <br/><br/> Office 365 | [Dynamics online and Azure AD service-principal or Office 365 authentication](#dynamics-365-and-dynamics-crm-online) |
| Dynamics 365 on-premises with internet-facing deployment (IFD) <br/><br/> Dynamics CRM 2016 on-premises with IFD <br/><br/> Dynamics CRM 2015 on-premises with IFD | IFD | [Dynamics on-premises with IFD and IFD authentication](#dynamics-365-and-dynamics-crm-on-premises-with-ifd) |
+> [!IMPORTANT]
+>If your tenant and user are configured in Azure Active Directory for [conditional access](/azure/active-directory/conditional-access/overview), or if Multi-Factor Authentication is required, you will not be able to use the Office 365 authentication type. For those situations, you must use Azure Active Directory (Azure AD) service principal authentication.
For Dynamics 365 specifically, the following application types are supported:- - Dynamics 365 for Sales - Dynamics 365 for Customer Service - Dynamics 365 for Field Service - Dynamics 365 for Project Service Automation - Dynamics 365 for Marketing- This connector doesn't support other application types like Finance, Operations, and Talent. >[!TIP]
This connector doesn't support other application types like Finance, Operations,
This Dynamics connector is built on top of [Dynamics XRM tooling](/dynamics365/customer-engagement/developer/build-windows-client-applications-xrm-tools). ## Prerequisites
+To use this connector with Azure AD service-principal authentication, you must set up server-to-server (S2S) authentication in Dataverse or Dynamics. First, register the application user (service principal) in Azure Active Directory. You can find out how to do this [here](/azure/active-directory/develop/howto-create-service-principal-portal). During application registration, you will need to create that user in Dataverse or Dynamics and grant permissions. Those permissions can be granted either directly or indirectly by adding the application user to a team that has been granted permissions in Dataverse or Dynamics. You can find more information on how to set up an application user to authenticate with Dataverse [here](/powerapps/developer/data-platform/use-single-tenant-server-server-authentication).
-To use this connector with Azure AD service-principal authentication, you must set up server-to-server (S2S) authentication in Common Data Service or Dynamics. Refer to [this article](/powerapps/developer/common-data-service/build-web-applications-server-server-s2s-authentication) for detailed steps.
## Get started
The following properties are supported for the Dynamics linked service.
} } ```+ #### Example: Dynamics online using Azure AD service-principal and certificate authentication ```json
The following properties are supported for the Dynamics linked service.
} } ```- #### Example: Dynamics online using Office 365 authentication ```json
If all of your source records map to the same target entity and your source data
To learn details about the properties, see [Lookup activity](control-flow-lookup-activity.md). ## Next steps
-For a list of data stores the copy activity in Data Factory supports as sources and sinks, see [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+
+For a list of data stores the copy activity in Data Factory supports as sources and sinks, see [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Connector Troubleshoot Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-troubleshoot-guide.md
Azure Cosmos DB calculates RUs, see [Request units in Azure Cosmos DB](../cosmos
| :-- | :-- | | For Azure SQL, if the error message contains the string "SqlErrorNumber=47073", it means that public network access is denied in the connectivity setting. | On the Azure SQL firewall, set the **Deny public network access** option to *No*. For more information, see [Azure SQL connectivity settings](../azure-sql/database/connectivity-settings.md#deny-public-network-access). | | For Azure SQL, if the error message contains an SQL error code such as "SqlErrorNumber=[errorcode]", see the Azure SQL troubleshooting guide. | For a recommendation, see [Troubleshoot connectivity issues and other errors with Azure SQL Database and Azure SQL Managed Instance](../azure-sql/database/troubleshoot-common-errors-issues.md). |
- | Check to see whether port 1433 is in the firewall allow list. | For more information, see [Ports used by SQL Server](/sql/sql-server/install/configure-the-windows-firewall-to-allow-sql-server-access#ports-used-by-). |
+ | Check to see whether port 1433 is in the firewall allowlist. | For more information, see [Ports used by SQL Server](/sql/sql-server/install/configure-the-windows-firewall-to-allow-sql-server-access#ports-used-by-). |
| If the error message contains the string "SqlException", the SQL Database error indicates that some specific operation failed. | For more information, search by SQL error code in [Database engine errors](/sql/relational-databases/errors-events/database-engine-events-and-errors). For further help, contact Azure SQL support. | | If this is a transient issue (for example, an unstable network connection), add a retry in the activity policy to mitigate it. | For more information, see [Pipelines and activities in Azure Data Factory](./concepts-pipelines-activities.md#activity-policy). | | If the error message contains the string "Client with IP address '...' is not allowed to access the server", and you're trying to connect to Azure SQL Database, the error is usually caused by an Azure SQL Database firewall issue. | In the Azure SQL Server firewall configuration, enable the **Allow Azure services and resources to access this server** option. For more information, see [Azure SQL Database and Azure Synapse IP firewall rules](../azure-sql/database/firewall-configure.md). |
Azure Cosmos DB calculates RUs, see [Request units in Azure Cosmos DB](../cosmos
| If your source is a folder, the files under the specified folder might have a different schema. | Make sure that the files in the specified folder have an identical schema. |
-## Dynamics 365, Common Data Service, and Dynamics CRM
+## Dynamics 365, Dataverse (Common Data Service), and Dynamics CRM
### Error code: DynamicsCreateServiceClientError
Azure Cosmos DB calculates RUs, see [Request units in Azure Cosmos DB](../cosmos
- **Message**: `Failed to connect to Dynamics: %message;`
+ - **Cause**: You are seeing `Unable to Login to Dynamics CRM, message:ERROR REQUESTING Token FROM THE Authentication context - USER intervention required but not permitted by prompt behavior
+AADSTS50079: Due to a configuration change made by your administrator, or because you moved to a new location, you must enroll in multi-factor authentication to access '00000007-0000-0000-c000-000000000000'` if your use case meets **all** of the following three conditions:
+ - You are connecting to Dynamics 365, Dataverse (Common Data Service), or Dynamics CRM.
+ - You are using Office 365 authentication.
+ - Your tenant and user are configured in Azure Active Directory for [conditional access](/azure/active-directory/conditional-access/overview) and/or Multi-Factor Authentication is required (see this [link](/powerapps/developer/data-platform/authenticate-office365-deprecation) to the Dataverse doc).
+
+ Under these circumstances, the connection used to succeed before 6/8/2021.
+ Starting 6/9/2021, the connection will fail because of the deprecation of the regional Discovery Service (see this [link](/power-platform/important-changes-coming#regional-discovery-service-is-deprecated)).
+
+ - **Recommendation**:
+ If your tenant and user are configured in Azure Active Directory for [conditional access](/azure/active-directory/conditional-access/overview) and/or Multi-Factor Authentication is required, you must use 'Azure AD service-principal' to authenticate after 6/8/2021. Refer to this [link](/azure/data-factory/connector-dynamics-crm-office-365#prerequisites) for detailed steps.
+ - **Cause**: If you see `Office 365 auth with OAuth failed` in the error message, it means that your server might have some configurations not compatible with OAuth.
Azure Cosmos DB calculates RUs, see [Request units in Azure Cosmos DB](../cosmos
If you want to promote the low throughput, contact your SFTP administrator to increase the concurrent connection count limit, or you can do one of the following:
- * If you're using Self-hosted IR, add the Self-hosted IR machine's IP to the allow list.
- * If you're using Azure IR, add [Azure Integration Runtime IP addresses](./azure-integration-runtime-ip-addresses.md). If you don't want to add a range of IPs to the SFTP server allow list, use Self-hosted IR instead.
+ * If you're using Self-hosted IR, add the Self-hosted IR machine's IP to the allowlist.
+ * If you're using Azure IR, add [Azure Integration Runtime IP addresses](./azure-integration-runtime-ip-addresses.md). If you don't want to add a range of IPs to the SFTP server allowlist, use Self-hosted IR instead.
## SharePoint Online list
For more troubleshooting help, try these resources:
* [Azure videos](https://azure.microsoft.com/resources/videos/index/?sort=newest&services=data-factory)
* [Microsoft Q&A page](/answers/topics/azure-data-factory.html)
* [Stack Overflow forum for Data Factory](https://stackoverflow.com/questions/tagged/azure-data-factory)
-* [Twitter information about Data Factory](https://twitter.com/hashtag/DataFactory)
+* [Twitter information about Data Factory](https://twitter.com/hashtag/DataFactory)
data-factory Data Flow Sink https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-sink.md
For example, if I specify a single key column of `column1` in a cache sink calle
> [!NOTE]
> A cache sink must be in a completely independent data stream from any transformation referencing it via a cache lookup. A cache sink also must be the first sink written.
+**Write to activity output**
+
+The cached sink can optionally write your output data to the input of the next pipeline activity. This allows you to quickly and easily pass data out of your data flow activity without needing to persist the data in a data store.
+## Field mapping
+
+Similar to a select transformation, on the **Mapping** tab of the sink, you can decide which incoming columns will get written. By default, all input columns, including drifted columns, are mapped. This behavior is known as *automapping*.
data-factory Data Flow Troubleshoot Connector Format https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-troubleshoot-connector-format.md
Previously updated : 04/20/2021 Last updated : 05/18/2021
To overwrite the default behavior and bring in additional fields, ADF provides o
![Screenshot that shows the second option to customize the source schema.](./media/data-flow-troubleshoot-connector-format/customize-schema-option-2.png)
+## CDM
+
+### Model.Json files with special characters
+
+#### Symptoms
+You may encounter an issue where the final name of the model.json file contains special characters.
+
+#### Error message  
+`at Source 'source1': java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: PPDFTable1.csv@snapshot=2020-10-21T18:00:36.9469086Z. ` 
+
+#### Recommendation  
+Replace the special characters in the file name. File names with special characters work in Synapse but not in ADF.
+
+### No data output in the data preview or after running pipelines
+
+#### Symptoms
+When you use the manifest.json for CDM, no data is shown in the data preview or shown after running a pipeline. Only headers are shown. You can see this issue in the picture below.<br/>
+
+![Screenshot that shows the no data output symptom.](./media/data-flow-troubleshoot-connector-format/no-data-output.png)
+
+#### Cause
+The manifest document describes the CDM folder, for example, the entities that you have in the folder, the references of those entities, and the data that corresponds to this instance. Your manifest document is missing the `dataPartitions` information that tells ADF where to read the data; because it is empty, ADF returns no data.
+
+#### Recommendation
+Update your manifest document to have the `dataPartitions` information, and you can refer to this example manifest document to update your document: [Common Data Model metadata: Introducing manifest-Example manifest document](https://docs.microsoft.com/common-data-model/cdm-manifest#example-manifest-document).
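As a minimal sketch (the entity, schema, and file names here are hypothetical; consult the linked example manifest document for the authoritative shape), a manifest entry that includes `dataPartitions` looks roughly like this:

```json
{
  "manifestName": "default",
  "entities": [
    {
      "type": "LocalEntity",
      "entityName": "Sales",
      "entityPath": "Sales.cdm.json/Sales",
      "dataPartitions": [
        { "location": "Sales/Sales-data.csv" }
      ]
    }
  ]
}
```

The key point is that each entity carries a non-empty `dataPartitions` array whose `location` entries point at the actual data files.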
+
+### JSON array attributes are inferred as separate columns
+
+#### Symptoms 
+You may encounter an issue where one attribute (string type) of the CDM entity has a JSON array as data. When this data is encountered, ADF incorrectly infers it as separate columns. As you can see from the following pictures, a single attribute presented in the source (msfp_otherproperties) is inferred as a separate column in the CDM connector's preview.<br/>
+
+- In the CSV source data (refer to the second column): <br/>
+
+ ![Screenshot that shows the attribute in the CSV source data.](./media/data-flow-troubleshoot-connector-format/json-array-csv.png)
+
+- In the CDM source data preview: <br/>
+
+ ![Screenshot that shows the separate column in the CDM source data.](./media/data-flow-troubleshoot-connector-format/json-array-cdm.png)
+
+ 
+You may also try to map drifted columns and use the data flow expression to transform this attribute into an array. But because this attribute is read as a separate column, transforming it into an array does not work.
+
+#### Cause
+This issue is likely caused by the commas within your JSON object value for that column. Since your data file is expected to be a CSV file, the comma indicates that it is the end of a column’s value.
+
+#### Recommendation
+To solve this problem, double-quote your JSON column and escape any inner quotes with a backslash (`\`). That way, the contents of the column's value can be read entirely as a single column.
+  
+>[!Note]
+>CDM doesn't indicate that the data type of the column value is JSON; it reports it as a string, and the value is parsed as such.
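As a small illustration of this fix, using Python's `csv` module with a backslash escape character (a sketch of the general principle, not ADF's actual parser; the column names and values are made up): once the JSON value is double-quoted and its inner quotes are backslash-escaped, it survives parsing as a single column.

```python
import csv
import io
import json

# A hypothetical CSV row whose second column holds a JSON object value.
# The value is wrapped in double quotes and the inner quotes are
# escaped with a backslash, as recommended above.
raw = '1,"{\\"tags\\": [1, 2], \\"score\\": 3}"\n'

# Parse with a double-quote quote char and a backslash escape char.
rows = list(csv.reader(io.StringIO(raw), quotechar='"',
                       escapechar='\\', doublequote=False))

value = rows[0][1]          # the whole JSON object, read as one column
parsed = json.loads(value)  # now it can be treated as structured data
print(parsed["tags"])       # → [1, 2]
```

Without the surrounding quotes, the commas inside the JSON object would split it across several columns, which is exactly the symptom described above.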
+
+### Unable to fetch data in the data flow preview
+
+#### Symptoms
+You use CDM with model.json generated by Power BI. When you preview the CDM data using the data flow preview, you encounter an error: `No output data.`
+
+#### Cause
+ The following code exists in the partitions in the model.json file generated by the Power BI data flow.
+```json
+"partitions": [  
+{  
+"name": "Part001",  
+"refreshTime": "2020-10-02T13:26:10.7624605+00:00",  
+"location": "https://datalakegen2.dfs.core.windows.net/powerbi/salesEntities/salesPerfByYear.csv @snapshot=2020-10-02T13:26:10.6681248Z"  
+}
+]
+```
+For this model.json file, the issue is that the naming schema of the data partition file contains special characters; file paths containing '@' are not currently supported.
+
+#### Recommendation
+Please remove the `@snapshot=2020-10-02T13:26:10.6681248Z` part from the data partition file name and the model.json file, and then try again.
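That cleanup can be sketched programmatically (a hypothetical helper, not an official tool), stripping the trailing `@snapshot=...` qualifier from the partition location before the model.json is reused:

```python
import re

def strip_snapshot(location: str) -> str:
    """Remove a trailing ' @snapshot=...' qualifier from a partition location."""
    return re.sub(r"\s*@snapshot=\S+\s*$", "", location)

# The partition location from the model.json example above.
loc = ("https://datalakegen2.dfs.core.windows.net/powerbi/"
       "salesEntities/salesPerfByYear.csv @snapshot=2020-10-02T13:26:10.6681248Z")
print(strip_snapshot(loc))
```

Apply the same renaming to the data partition file itself so the file name and the model.json stay in sync.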
+
+### The corpus path is null or empty
+
+#### Symptoms
+When you use CDM in the data flow with the model format, you cannot preview the data, and you encounter the error: `DF-CDM_005 The corpus path is null or empty`. The error is shown in the following picture:  
+
+![Screenshot that shows the corpus path error.](./media/data-flow-troubleshoot-connector-format/corpus-path-error.png)
+
+#### Cause
+Your data partition path in the model.json is pointing to a blob storage location and not your data lake. The location should have the base URL of **.dfs.core.windows.net** for the ADLS Gen2. 
+
+#### Recommendation
+To solve this issue, you can refer to this article: [ADF Adds Support for Inline Datasets and Common Data Model to Data Flows](https://techcommunity.microsoft.com/t5/azure-data-factory/adf-adds-support-for-inline-datasets-and-common-data-model-to/ba-p/1441798), and the following picture shows the way to fix the corpus path error in this article.
+
+![Screenshot that shows how to fix the corpus path error.](./media/data-flow-troubleshoot-connector-format/fix-format-issue.png)
+
+### Unable to read CSV data files
+
+#### Symptoms 
+You use the inline dataset as the common data model with manifest as a source, and you have provided the entry manifest file, root path, entity name, and path. In the manifest, you have the data partitions with the CSV file location. Meanwhile, the entity schema and the CSV schema are identical, and all validations were successful. However, in the data preview, only the schema rather than the data is loaded, and the data is invisible, as shown in the following picture:
+
+![Screenshot that shows the issue of unable to read data files.](./media/data-flow-troubleshoot-connector-format/unable-read-data.png)
+
+#### Cause
+Your CDM folder is not separated into logical and physical models, and only physical models exist in the CDM folder. The following two articles describe the difference: [Logical definitions](https://docs.microsoft.com/common-data-model/sdk/logical-definitions) and [Resolving a logical entity definition](https://docs.microsoft.com/common-data-model/sdk/convert-logical-entities-resolved-entities).<br/> 
+
+#### Recommendation
+For the data flow using CDM as a source, try to use a logical model as your entity reference, and use the manifest that describes the location of the physical resolved entities and the data partition locations. You can see some samples of logical entity definitions in the public CDM GitHub repository: [CDM-schemaDocuments](https://github.com/microsoft/CDM/tree/master/schemaDocuments)<br/>
+
+A good starting point for forming your corpus is to copy the files within the schema documents folder (just that level inside the GitHub repository) and put those files into a folder. Afterwards, you can use one of the predefined logical entities within the repository (as a starting or reference point) to create your logical model.<br/>
+
+Once the corpus is set up, it's recommended that you use CDM as a sink within data flows, so that a well-formed CDM folder can be properly created. You can use your CSV dataset as a source and then sink it to the CDM model that you created.
+
+## Delta
+
+### The sink does not support the schema drift with upsert or update
+
+#### Symptoms
+You may find that the delta sink in mapping data flows does not support schema drift with upsert/update. The problem is that schema drift does not work when delta is the target in a mapping data flow and an update/upsert is configured.
+
+If a column is added to the source after an "initial" load to the delta, subsequent jobs fail with an error that the new column cannot be found. This happens when you upsert/update with the alter row; it seems to work for inserts only.
+
+#### Error message
+`DF-SYS-01 at Sink 'SnkDeltaLake': org.apache.spark.sql.AnalysisException: cannot resolve target.BICC_RV in UPDATE clause given columns target. `
+
+#### Cause
+This is an issue with the delta format caused by a limitation of the io delta library used in the data flow runtime. A fix is in progress.
+
+#### Recommendation
+To solve this problem, you need to update the schema first and then write the data. Follow the steps below: <br/>
+1. Create one data flow that includes an insert-only delta sink with the merge schema option to update the schema. 
+1. After Step 1, use delete/upsert/update to modify the target sink without changing the schema. <br/>
+
+## Azure PostgreSQL
+
+### Encounter an error: Failed with exception: handshake_failure
+
+#### Symptoms
+When you use Azure PostgreSQL as a source or sink in the data flow, for example when previewing data or during a debug/triggered run, the job may fail with the following error message:
+
+ `PSQLException: SSL error: Received fatal alert: handshake_failure `<br/>
+ `Caused by: SSLHandshakeException: Received fatal alert: handshake_failure.`
+
+#### Cause
+If you use the flexible server or Hyperscale (Citus) for your Azure PostgreSQL server: because the system is built via Spark upon an Azure Databricks cluster, a limitation in Azure Databricks blocks our system from connecting to the flexible server or Hyperscale (Citus). You can review the following two links as references.
+- [Handshake fails trying to connect from Azure Databricks to Azure PostgreSQL with SSL](https://docs.microsoft.com/answers/questions/170730/handshake-fails-trying-to-connect-from-azure-datab.html)
+
+- [MCW-Real-time-data-with-Azure-Database-for-PostgreSQL-Hyperscale](https://github.com/microsoft/MCW-Real-time-data-with-Azure-Database-for-PostgreSQL-Hyperscale/blob/master/Hands-on%20lab/HOL%20step-by%20step%20-%20Real-time%20data%20with%20Azure%20Database%20for%20PostgreSQL%20Hyperscale.md)<br/>
+ Refer to the content in the following picture in this article:<br/>
+
+ ![Screenshots that shows the referring content in the article above.](./media/data-flow-troubleshoot-connector-format/handshake-failure-cause-2.png)
+
+#### Recommendation
+You can try to use copy activities to unblock this issue.
+
+## CSV and Excel
+
+### Setting the quote character to 'no quote char' is not supported for CSV
+
+#### Symptoms
+
+There are several issues that are not supported in the CSV when the quote character is set to 'no quote char':
+
+1. When the quote character is set to 'no quote char', multi-char column delimiter can't start and end with the same letters.
+2. When the quote character is set to 'no quote char', multi-char column delimiter can't contain the escape character: `\`.
+3. When the quote character is set to 'no quote char', the column value can't contain the row delimiter.
+4. The quote character and the escape character cannot both be empty (no quote and no escape) if the column value contains a column delimiter.
+
+#### Cause
+
+Causes of the symptoms are stated below with examples respectively:
+1. Start and end with the same letters.<br/>
+`column delimiter: $*^$*`<br/>
+`column value: abc$*^ def`<br/>
+`csv sink: abc$*^$*^$*def ` <br/>
+`will be read as "abc" and "^$*def"`<br/>
+
+2. The multi-char delimiter contains escape characters.<br/>
+`column delimiter: \x`<br/>
+`escape char:\`<br/>
+`column value: "abc\\xdef"`<br/>
+The escape character will either escape the column delimiter or escape the character itself.
+
+3. The column value contains the row delimiter. <br/>
+A quote character is needed to tell whether the row delimiter is inside the column value or not.
+
+4. The quote character and the escape character are both empty and the column value contains column delimiters.<br/>
+`Column delimiter: \t`<br/>
+`column value: 111\t222\t33\t3`<br/>
+`It is ambiguous whether it contains 3 columns (111,222,33\t3) or 4 columns (111,222,33,3).`<br/>
+
+#### Recommendation
+The first symptom and the second symptom cannot be solved currently. For the third and fourth symptoms, you can apply the following methods:
+- For Symptom 3, do not use 'no quote char' for a multiline CSV file.
+- For Symptom 4, set either the quote character or the escape character as non-empty, or you can remove all column delimiters inside your data.
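The fix for Symptom 4 can be illustrated with Python's `csv` module (a sketch of the general principle, not ADF's actual parser): once a quote character is set, a delimiter inside the quoted value no longer splits the column.

```python
import csv
import io

# Ambiguous without quotes: is this 3 or 4 tab-separated columns?
ambiguous = "111\t222\t33\t3\n"
plain = list(csv.reader(io.StringIO(ambiguous), delimiter="\t"))
print(plain[0])  # parsed as 4 columns: ['111', '222', '33', '3']

# With a quote character, the intent is unambiguous: 3 columns,
# the last of which contains a literal tab.
quoted = '111\t222\t"33\t3"\n'
rows = list(csv.reader(io.StringIO(quoted), delimiter="\t", quotechar='"'))
print(rows[0])  # ['111', '222', '33\t3']
```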
+
+### Read files with different schemas error
+
+#### Symptoms
+
+When you use data flows to read files such as CSV and Excel files with different schemas, the data flow debug, sandbox, or activity run will fail.
+- For CSV, data misalignment occurs when the schemas of the files differ.
+
+ ![Screenshot that shows the first schema error.](./media/data-flow-troubleshoot-connector-format/schema-error-1.png)
+
+- For Excel, an error occurs when the schema of the file is different.
+
+ ![Screenshot that shows the second schema error.](./media/data-flow-troubleshoot-connector-format/schema-error-2.png)
+
+#### Cause
+
+Reading files with different schemas in the data flow is not supported.
+
+#### Recommendation
+
+If you still want to transfer files such as CSV and Excel files with different schemas in the data flow, you can use the ways below to work around:
+
+- For CSV, you need to manually merge the schemas of the different files to get the full schema. For example, file_1 has columns `c_1, c_2, c_3` while file_2 has columns `c_3, c_4,... c_10`, so the merged, full schema is `c_1, c_2... c_10`. Then make the other files carry the same full schema even if they have no data for some columns. For example, if file_x only has columns `c_1, c_2, c_3, c_4`, add the additional columns `c_5, c_6, ... c_10` to the file, and then it can work.
+
+- For Excel, you can solve this issue by applying one of the following options:
+
+ - **Option-1**: You need to manually merge the schemas of the different files to get the full schema. For example, file_1 has columns `c_1, c_2, c_3` while file_2 has columns `c_3, c_4,... c_10`, so the merged, full schema is `c_1, c_2... c_10`. Then make the other files carry the same schema even if they have no data for some columns. For example, if file_x with sheet "SHEET_1" only has columns `c_1, c_2, c_3, c_4`, add the additional columns `c_5, c_6, ... c_10` to the sheet too, and then it can work.
+ - **Option-2**: Use **range (for example, A1:G100) + firstRowAsHeader=false**, and then it can load data from all Excel files even though the column name and count is different.
+
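The manual merge described above for CSV can be sketched in Python (a hypothetical helper for preparing files outside ADF, not an ADF feature): take the union of all headers, then pad each file out to the full schema with empty values.

```python
import csv
import io

def pad_to_full_schema(csv_texts):
    """Merge the headers of several CSV texts into one full schema and
    rewrite every file so it carries all columns (empty where missing)."""
    parsed = [list(csv.reader(io.StringIO(t))) for t in csv_texts]

    # Order-preserving union of all headers.
    full = []
    for rows in parsed:
        for col in rows[0]:
            if col not in full:
                full.append(col)

    # Rewrite each file with the full header, padding absent columns.
    out = []
    for rows in parsed:
        pos = {c: i for i, c in enumerate(rows[0])}
        buf = io.StringIO()
        writer = csv.writer(buf, lineterminator="\n")
        writer.writerow(full)
        for row in rows[1:]:
            writer.writerow([row[pos[c]] if c in pos else "" for c in full])
        out.append(buf.getvalue())
    return full, out

full, files = pad_to_full_schema(["c_1,c_2,c_3\n1,2,3\n", "c_3,c_4\n9,8\n"])
print(full)      # ['c_1', 'c_2', 'c_3', 'c_4']
print(files[1])  # header c_1,c_2,c_3,c_4 with empty c_1, c_2 values
```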
+## Azure Synapse Analytics
+
+### Serverless pool (SQL on-demand) related issues
+
+#### Symptoms
+You use Azure Synapse Analytics, and the linked service is actually a Synapse serverless pool. It was formerly named SQL on-demand pool, and can be distinguished by a server name that contains `ondemand`, for example, `space-ondemand.sql.azuresynapse.net`. You may encounter several unique failures as below:<br/>
+
+1. When you want to use Synapse serverless pool as a Sink, you face the following error:<br/>
+`Sink results in 0 output columns. Please ensure at least one column is mapped`
+1. When you select 'enable staging' in the Source, you face the following error:
+`shaded.msdataflow.com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near 'IDENTITY'.`
+1. When you want to fetch data from an external table, you face the following error: `shaded.msdataflow.com.microsoft.sqlserver.jdbc.SQLServerException: External table 'dbo' is not accessible because location does not exist or it is used by another process.`
+1. When you want to fetch data from Cosmos DB through Serverless pool by query/from view, you face the following error:
+ `Job failed due to reason: Connection reset.`
+1. When you want to fetch data from a view, you may encounter various errors.
+
+#### Cause
+Causes of the symptoms are stated below respectively:
+1. The serverless pool cannot be used as a sink; it doesn't support writing data into the database.
+1. The serverless pool doesn't support staged data loading, so 'enable staging' is not supported.
+1. The authentication method that you use doesn't have the correct permission on the external data source that the external table refers to.
+1. There is a known limitation in the Synapse serverless pool that blocks you from fetching Cosmos DB data from data flows.
+1. A view is a virtual table based on a SQL statement. The root cause is inside the statement of the view.
+
+#### Recommendation
+
+You can apply the following steps to solve your issues correspondingly.
+1. Don't use the serverless pool as a sink.
+1. Don't use 'enable staging' in the source for the serverless pool.
+1. Only a service principal/managed identity that has permission to the external table data can query it. You should grant the 'Storage Blob Data Contributor' permission on the external data source to the authentication method that you use in ADF.
+ >[!Note]
+ > User-password authentication cannot query external tables. You can refer to this article for more information: [Security model](https://docs.microsoft.com/azure/synapse-analytics/metadata/database#security-model).
+
+1. You can use copy activity to fetch Cosmos DB data from the serverless pool.
+1. You can provide the SQL statement which creates the view to the engineering support team, and they can help analyze if the statement hits an authentication issue or something else.
++
+### Load small size data to Data Warehouse without staging is slow
+
+#### Symptoms
+When you load a small amount of data to the Data Warehouse without staging, it takes a long time to finish. For example, the data size is 2 MB, but it takes more than 1 hour to finish.
+
+#### Cause
+This issue is caused by the row count rather than the size. The row count is a few thousand, and each insert needs to be packaged into an independent request, go to the control node, start a new transaction, get locks, and go to the distribution node, repeatedly. Bulk load gets the lock once, and each distribution node performs the insert by batching into memory efficiently.
+
+If 2 MB were inserted as just a few records, it would be fast. For example, it would be fast if the data were 4 rows of 500 KB each.
+
+#### Recommendation
+You need to enable staging to improve the performance.
++
+### Read empty string value ("") as NULL with the enable staging
+
+#### Symptoms
+When you use Synapse as a source in the data flow such as previewing data and debugging/triggering run and enable staging to use the PolyBase, if your column value contains empty string value (`""`), it will be changed to null.
+
+#### Cause
+The data flow back end uses Parquet as the PolyBase format, and there is a known limitation in the Synapse SQL pool gen2, which will automatically change the empty string value to null.
+
+#### Recommendation
+You can try to solve this issue by the following methods:
+1. If your data size is not huge, you can disable **Enable staging** in the Source, but the performance will be affected.
+1. If you need to enable staging, you can use the **iifNull()** function (for example, `iifNull(yourColumn, '')`, where `yourColumn` is a placeholder name) to manually change the specific column from null back to an empty string value.
+
+### Managed service identity error
+
+#### Symptoms
+When you use Synapse as a source/sink in the data flow (previewing data, debug/trigger run, and so on) with staging enabled to use PolyBase, and the staging store's linked service (Blob, Gen2, etc.) is created to use Managed Identity (MI) authentication, your job could fail with the following error shown in the picture: <br/>
+
+![Screenshots that shows the service identity error.](./media/data-flow-troubleshoot-connector-format/service-identity-error.png)
+
+#### Error message
+`shaded.msdataflow.com.microsoft.sqlserver.jdbc.SQLServerException: Managed Service Identity has not been enabled on this server. Please enable Managed Service Identity and try again.`
+
+#### Cause
+1. If the SQL pool is created from Synapse workspace, MI authentication on staging store with the PolyBase is not supported for the old SQL pool.
+1. If the SQL pool is the old Data Warehouse (DWH) version, MI of the SQL server is not assigned to the staging store.
+
+#### Recommendation
+You need to confirm if the SQL pool is created from the Synapse workspace.
+
+- If the SQL pool is created from the Synapse workspace, you need to re-register the MI of the workspace. You can apply the following steps to work around this issue by re-registering the workspace's MI:
+ 1. Go to your Synapse workspace in the Azure portal.
+ 1. Go to the **managed identities** blade.
+ 1. If the **Allow pipelines** option is already checked, uncheck this setting and save.
+ 1. Check the **Allow pipelines** option and save.
+
+- If the SQL pool is the old DWH version, you only need to enable MI for your SQL server and assign the staging store permission to the MI of your SQL server. You can refer to the steps in this article as an example: [Use virtual network service endpoints and rules for servers in Azure SQL Database](https://docs.microsoft.com/azure/azure-sql/database/vnet-service-endpoint-rule-overview#steps).
+ ## Next steps For more help with troubleshooting, see these resources:
data-factory Data Flow Tutorials https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-tutorials.md
Previously updated : 05/04/2021 Last updated : 05/18/2021 # Mapping data flow video tutorials
As updates are constantly made to the product, some features have added or diffe
[Transform complex data types](https://youtu.be/Wk0C76wnSDE)
+[Output to next activity](https://youtu.be/r1m3Ya14qpE?hd=1)
+ ## Source and sink [Reading and writing JSONs](https://www.youtube.com/watch?v=yY5aB7Kdhjg)
databox-online Azure Stack Edge 2101 Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-2101-release-notes.md
# Azure Stack Edge Pro with FPGA 2101 release notes
-The following release notes identify the critical open issues and the resolved issues for the 2101 release of Azure Stack Edge Pro with a built-in Field Programmable Gate Array (FPGA).
+The following release notes identify the critical open issues and the resolved issues for the 2101 release of Azure Stack Edge Pro FPGA with a built-in Field Programmable Gate Array (FPGA).
The release notes are continuously updated. As critical issues that require a workaround are discovered, they are added. Before you deploy your Azure Stack Edge device, carefully review the information in the release notes.
databox-online Azure Stack Edge Connect Powershell Interface https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-connect-powershell-interface.md
Title: Connect to and manage Microsoft Azure Stack Edge Pro device via the Windows PowerShell interface | Microsoft Docs
-description: Describes how to connect to and then manage Azure Stack Edge Pro via the Windows PowerShell interface.
+ Title: Connect to and manage Microsoft Azure Stack Edge Pro FPGA device via the Windows PowerShell interface | Microsoft Docs
+description: Describes how to connect to and then manage Azure Stack Edge Pro FPGA via the Windows PowerShell interface.
# Manage an Azure Stack Edge Pro FPGA device via Windows PowerShell
-Azure Stack Edge Pro solution lets you process data and send it over the network to Azure. This article describes some of the configuration and management tasks for your Azure Stack Edge Pro device. You can use the Azure portal, local web UI, or the Windows PowerShell interface to manage your device.
+Azure Stack Edge Pro FPGA solution lets you process data and send it over the network to Azure. This article describes some of the configuration and management tasks for your Azure Stack Edge Pro FPGA device. You can use the Azure portal, local web UI, or the Windows PowerShell interface to manage your device.
This article focuses on the tasks you do using the PowerShell interface.
To exit the remote PowerShell session, close the PowerShell window.
## Next steps -- Deploy [Azure Stack Edge Pro](azure-stack-edge-deploy-prep.md) in Azure portal.
+- Deploy [Azure Stack Edge Pro FPGA](azure-stack-edge-deploy-prep.md) in Azure portal.
databox-online Azure Stack Edge Contact Microsoft Support https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-contact-microsoft-support.md
Title: Log support ticket for Azure Stack Edge Pro, Azure Data Box Gateway | Microsoft Docs
-description: Learn how to log support request for issues related to your Azure Stack Edge Pro or Data Box Gateway orders.
+ Title: Log support ticket for Azure Stack Edge, Azure Data Box Gateway | Microsoft Docs
+description: Learn how to log support request for issues related to your Azure Stack Edge or Data Box Gateway orders.
Last updated 03/05/2021
-# Open a support ticket for Azure Stack Edge Pro and Azure Data Box Gateway
+# Open a support ticket for Azure Stack Edge and Azure Data Box Gateway
[!INCLUDE [applies-to-GPU-and-pro-r-and-mini-r-databox-gateway-skus](../../includes/azure-stack-edge-applies-to-gpu-pro-r-mini-r-databox-gateway-sku.md)]
-This article applies to Azure Stack Edge Pro and Azure Data Box Gateway both of which are managed by the Azure Stack Edge Pro / Azure Data Box Gateway service. If you encounter any issues with your service, you can create a service request for technical support. This article walks you through:
+This article applies to Azure Stack Edge and Azure Data Box Gateway both of which are managed by the Azure Stack Edge / Azure Data Box Gateway service. If you encounter any issues with your service, you can create a service request for technical support. This article walks you through:
* How to create a support request. * How to manage a support request lifecycle from within the portal.
This article applies to Azure Stack Edge Pro and Azure Data Box Gateway both of
Do the following steps to create a support request:
-1. Go to your Azure Stack Edge Pro or Data Box Gateway order. Navigate to **Support + troubleshooting** section and then select **New support request**.
+1. Go to your Azure Stack Edge or Data Box Gateway order. Navigate to **Support + troubleshooting** section and then select **New support request**.
2. In **New support request**, on the **Basics** tab, take the following steps: 1. From the **Issue type** dropdown list, select **Technical**. 2. Choose your **Subscription**.
- 3. Under **Service**, check **My Services**. From the dropdown list, select **Azure Stack Edge Pro and Data Box Gateway**.
+ 3. Under **Service**, check **My Services**. From the dropdown list, select **Azure Stack Edge and Data Box Gateway**.
4. Select your **Resource**. This corresponds to the name of your order. 5. Give a brief **Summary** of the issue you are experiencing. 6. Select your **Problem type**.
After creating a support ticket, you can manage the lifecycle of the ticket from
## Next steps
-Learn how to [Troubleshoot issues related to Azure Stack Edge Pro](azure-stack-edge-troubleshoot.md).
+Learn how to [Troubleshoot issues related to Azure Stack Edge](azure-stack-edge-troubleshoot.md).
Learn how to [Troubleshoot issues related to Data Box Gateway](../databox-gateway/data-box-gateway-troubleshoot.md).
databox-online Azure Stack Edge Create Iot Edge Module https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-create-iot-edge-module.md
Title: C# IoT Edge module for Azure Stack Edge Pro | Microsoft Docs
-description: Learn how to develop a C# IoT Edge module that can be deployed on your Azure Stack Edge Pro.
+ Title: C# IoT Edge module for Azure Stack Edge Pro FPGA | Microsoft Docs
+description: Learn how to develop a C# IoT Edge module that can be deployed on your Azure Stack Edge Pro FPGA.
-# Develop a C# IoT Edge module to move files with Azure Stack Edge Pro
+# Develop a C# IoT Edge module to move files with Azure Stack Edge Pro FPGA
-This article steps you through how to create an IoT Edge module for deployment with your Azure Stack Edge Pro device. Azure Stack Edge Pro is a storage solution that allows you to process data and send it over network to Azure.
+This article steps you through how to create an IoT Edge module for deployment with your Azure Stack Edge Pro FPGA device. Azure Stack Edge Pro FPGA is a storage solution that allows you to process data and send it over network to Azure.
-You can use Azure IoT Edge modules with your Azure Stack Edge Pro to transform the data as it moved to Azure. The module used in this article implements the logic to copy a file from a local share to a cloud share on your Azure Stack Edge Pro device.
+You can use Azure IoT Edge modules with your Azure Stack Edge Pro FPGA to transform the data as it is moved to Azure. The module used in this article implements the logic to copy a file from a local share to a cloud share on your Azure Stack Edge Pro FPGA device.
In this article, you learn how to: > [!div class="checklist"] > > * Create a container registry to store and manage your modules (Docker images).
-> * Create an IoT Edge module to deploy on your Azure Stack Edge Pro device.
+> * Create an IoT Edge module to deploy on your Azure Stack Edge Pro FPGA device.
## About the IoT Edge module
-Your Azure Stack Edge Pro device can deploy and run IoT Edge modules. Edge modules are essentially Docker containers that perform a specific task, such as ingest a message from a device, transform a message, or send a message to an IoT Hub. In this article, you will create a module that copies files from a local share to a cloud share on your Azure Stack Edge Pro device.
+Your Azure Stack Edge Pro FPGA device can deploy and run IoT Edge modules. Edge modules are essentially Docker containers that perform a specific task, such as ingesting a message from a device, transforming a message, or sending a message to an IoT Hub. In this article, you will create a module that copies files from a local share to a cloud share on your Azure Stack Edge Pro FPGA device.
-1. Files are written to the local share on your Azure Stack Edge Pro device.
+1. Files are written to the local share on your Azure Stack Edge Pro FPGA device.
2. The file event generator creates a file event for each file written to the local share. The file events are also generated when a file is modified. The file events are then sent to IoT Edge Hub (in IoT Edge runtime).
3. The IoT Edge custom module processes the file event to create a file event object that also contains a relative path for the file. The module generates an absolute path using the relative file path and copies the file from the local share to the cloud share. The module then deletes the file from the local share.
-![How Azure IoT Edge module works on Azure Stack Edge Pro](./media/azure-stack-edge-create-iot-edge-module/how-module-works-1.png)
+![How Azure IoT Edge module works on Azure Stack Edge Pro FPGA](./media/azure-stack-edge-create-iot-edge-module/how-module-works-1.png)
Once the file is in the cloud share, it automatically gets uploaded to your Azure Storage account.
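The copy-then-delete flow described in step 3 can be sketched in shell. This is a hypothetical illustration only: the actual module is a C# IoT Edge container, and all paths below are made up for the sketch.

```shell
# Hypothetical sketch of the module logic: build an absolute path from the
# relative path in the file event, copy the file from the local share to the
# cloud share, then delete it from the local share. Paths are illustrative.
local_share="/tmp/localshare"
cloud_share="/tmp/cloudshare"
relative_path="incoming/sensor1.csv"   # relative path carried by the file event

# Simulate a file landing on the local share.
mkdir -p "$local_share/$(dirname "$relative_path")" "$cloud_share"
echo "sample data" > "$local_share/$relative_path"

# Copy to the cloud share (preserving the relative path), then delete the source.
src="$local_share/$relative_path"
dst="$cloud_share/$relative_path"
mkdir -p "$(dirname "$dst")"
cp "$src" "$dst" && rm "$src"
```

Once the copy lands in the cloud share, the upload to the associated storage account happens automatically, as noted above.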
Before you begin, make sure you have:

-- An Azure Stack Edge Pro device that is running.
+- An Azure Stack Edge Pro FPGA device that is running.
- The device also has an associated IoT Hub resource.
- The device has Edge compute role configured.
- For more information, go to [Configure compute](azure-stack-edge-deploy-configure-compute.md#configure-compute) for your Azure Stack Edge Pro.
+ For more information, go to [Configure compute](azure-stack-edge-deploy-configure-compute.md#configure-compute) for your Azure Stack Edge Pro FPGA.
- The following development resources:
In the previous section, you created an IoT Edge solution and added code to the
## Next steps
-To deploy and run this module on Azure Stack Edge Pro, see the steps in [Add a module](azure-stack-edge-deploy-configure-compute.md#add-a-module).
+To deploy and run this module on Azure Stack Edge Pro FPGA, see the steps in [Add a module](azure-stack-edge-deploy-configure-compute.md#add-a-module).
databox-online Azure Stack Edge Deploy Add Shares https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-deploy-add-shares.md
Title: Tutorial to transfer data to shares with Azure Stack Edge Pro | Microsoft Docs
-description: In this tutorial, learn how to add and connect to shares on your Azure Stack Edge Pro device, so that Azure Stack Edge Pro can transfer data to Azure.
+ Title: Tutorial to transfer data to shares with Azure Stack Edge Pro FPGA | Microsoft Docs
+description: In this tutorial, learn how to add and connect to shares on your Azure Stack Edge Pro FPGA device, so that Azure Stack Edge Pro FPGA can transfer data to Azure.
Last updated 01/04/2021
-# Customer intent: As an IT admin, I need to understand how to add and connect to shares on Azure Stack Edge Pro so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to add and connect to shares on Azure Stack Edge Pro FPGA so I can use it to transfer data to Azure.
-# Tutorial: Transfer data with Azure Stack Edge Pro
+# Tutorial: Transfer data with Azure Stack Edge Pro FPGA
-This tutorial describes how to add and connect to shares on your Azure Stack Edge Pro device. After you've added the shares, Azure Stack Edge Pro can transfer data to Azure.
+This tutorial describes how to add and connect to shares on your Azure Stack Edge Pro FPGA device. After you've added the shares, Azure Stack Edge Pro FPGA can transfer data to Azure.
This procedure can take around 10 minutes to complete.
In this tutorial, you learn how to:
## Prerequisites
-Before you add shares to Azure Stack Edge Pro, make sure that:
+Before you add shares to Azure Stack Edge Pro FPGA, make sure that:
-- You've installed your physical device as described in [Install Azure Stack Edge Pro](azure-stack-edge-deploy-install.md).
+- You've installed your physical device as described in [Install Azure Stack Edge Pro FPGA](azure-stack-edge-deploy-install.md).
-- You've activated the physical device as described in [Connect, set up, and activate Azure Stack Edge Pro](azure-stack-edge-deploy-connect-setup-activate.md).
+- You've activated the physical device as described in [Connect, set up, and activate Azure Stack Edge Pro FPGA](azure-stack-edge-deploy-connect-setup-activate.md).
## Add a share
To create a share, do the following procedure:
c. Provide a storage account where the share will reside.

> [!IMPORTANT]
- > Make sure that the Azure Storage account that you use does not have immutability policies set on it if you are using it with a Azure Stack Edge Pro or Data Box Gateway device. For more information, see [Set and manage immutability policies for blob storage](../storage/blobs/storage-blob-immutability-policies-manage.md).
+ > Make sure that the Azure Storage account that you use does not have immutability policies set on it if you are using it with an Azure Stack Edge Pro FPGA or Data Box Gateway device. For more information, see [Set and manage immutability policies for blob storage](../storage/blobs/storage-blob-immutability-policies-manage.md).
d. In the **Storage service** drop-down list, select **Block Blob**, **Page Blob**, or **Files**. The type of service you select depends on which format you want the data to use in Azure. In this example, because we want to store the data as block blobs in Azure, we select **Block Blob**. If you select **Page Blob**, make sure that your data is 512-byte aligned. For example, a VHDX is always 512-byte aligned.
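The 512-byte alignment requirement for Page Blob can be checked quickly before you pick the storage service. A minimal sketch, with an illustrative file name:

```shell
# Minimal check that a file's size is a multiple of 512 bytes, as required
# for Page Blob. The test file and its name are illustrative.
truncate -s 1048576 /tmp/disk.vhdx   # create a 1-MiB sparse test file

size=$(stat -c %s /tmp/disk.vhdx)
if [ $((size % 512)) -eq 0 ]; then
  echo "aligned: OK for Page Blob"
else
  echo "not 512-byte aligned: use Block Blob instead"
fi
```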
You can now connect to one or more of the shares that you created in the last st
### Connect to an SMB share
-On your Windows Server client connected to your Azure Stack Edge Pro device, connect to an SMB share by entering the commands:
+On your Windows Server client connected to your Azure Stack Edge Pro FPGA device, connect to an SMB share by entering the commands:
1. In a command window, type:
On your Windows Server client connected to your Azure Stack Edge Pro device, con
### Connect to an NFS share
-On your Linux client connected to your Azure Stack Edge Pro device, do the following procedure:
+On your Linux client connected to your Azure Stack Edge Pro FPGA device, do the following procedure:
1. Make sure that the client has the NFSv4 client installed. To install the NFS client, use the following command:
On your Linux client connected to your Azure Stack Edge Pro device, do the follo
For more information, go to [Install NFSv4 client](https://help.ubuntu.com/community/SettingUpNFSHowTo#NFSv4_client).
-2. After the NFS client is installed, mount the NFS share that you created on your Azure Stack Edge Pro device by using the following command:
+2. After the NFS client is installed, mount the NFS share that you created on your Azure Stack Edge Pro FPGA device by using the following command:
`sudo mount -t nfs -o sec=sys,resvport <device IP>:/<NFS shares on device> /home/username/<Folder on local Linux computer>`
> Use of the `sync` option when mounting shares improves the transfer rates of large files.
> Before you mount the share, make sure that the directories that will act as mountpoints on your local computer are already created. These directories should not contain any files or subfolders.
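The mountpoint precondition in the note above can be verified with a short check before running `mount`. The directory name below is illustrative:

```shell
# Create the mountpoint ahead of time and confirm it is empty, as required
# before mounting the NFS share. The directory name is illustrative.
mountpoint_dir="$HOME/edgemount"
mkdir -p "$mountpoint_dir"

if [ -z "$(ls -A "$mountpoint_dir")" ]; then
  echo "mountpoint is empty: safe to mount"
else
  echo "mountpoint is not empty: choose another directory" >&2
fi
```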
- The following example shows how to connect via NFS to a share on your Azure Stack Edge Pro device. The device IP is `10.10.10.60`. The share `mylinuxshare2` is mounted on the ubuntuVM. The share mount point is `/home/databoxubuntuhost/edge`.
+ The following example shows how to connect via NFS to a share on your Azure Stack Edge Pro FPGA device. The device IP is `10.10.10.60`. The share `mylinuxshare2` is mounted on the ubuntuVM. The share mount point is `/home/databoxubuntuhost/edge`.
`sudo mount -t nfs -o sec=sys,resvport 10.10.10.60:/mylinuxshare2 /home/databoxubuntuhost/Edge`
## Next steps
-In this tutorial, you learned about the following Azure Stack Edge Pro topics:
+In this tutorial, you learned about the following Azure Stack Edge Pro FPGA topics:
> [!div class="checklist"]
> * Add a share
> * Connect to share
-To learn how to transform your data by using Azure Stack Edge Pro, advance to the next tutorial:
+To learn how to transform your data by using Azure Stack Edge Pro FPGA, advance to the next tutorial:
> [!div class="nextstepaction"]
-> [Transform data with Azure Stack Edge Pro](./azure-stack-edge-deploy-configure-compute.md)
+> [Transform data with Azure Stack Edge Pro FPGA](./azure-stack-edge-deploy-configure-compute.md)
databox-online Azure Stack Edge Deploy Configure Compute Advanced https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-deploy-configure-compute-advanced.md
Title: Tutorial to filter, analyze data for advanced deployment with compute on Azure Stack Edge Pro | Microsoft Docs
-description: Learn how to configure compute role on Azure Stack Edge Pro and use it to transform data for advanced deployment flow before sending to Azure.
+ Title: Tutorial to filter, analyze data for advanced deployment with compute on Azure Stack Edge Pro FPGA | Microsoft Docs
+description: Learn how to configure compute role on Azure Stack Edge Pro FPGA and use it to transform data for advanced deployment flow before sending to Azure.
Last updated 01/06/2021
-# Customer intent: As an IT admin, I need to understand how to configure compute on Azure Stack Edge Pro for advanced deployment flow so I can use it to transform the data before sending it to Azure.
+# Customer intent: As an IT admin, I need to understand how to configure compute on Azure Stack Edge Pro FPGA for advanced deployment flow so I can use it to transform the data before sending it to Azure.
-# Tutorial: Transform data with Azure Stack Edge Pro for advanced deployment flow
+# Tutorial: Transform data with Azure Stack Edge Pro FPGA for advanced deployment flow
-This tutorial describes how to configure a compute role for an advanced deployment flow on your Azure Stack Edge Pro device. After you configure the compute role, Azure Stack Edge Pro can transform data before sending it to Azure.
+This tutorial describes how to configure a compute role for an advanced deployment flow on your Azure Stack Edge Pro FPGA device. After you configure the compute role, Azure Stack Edge Pro FPGA can transform data before sending it to Azure.
Compute can be configured for simple or advanced deployment flow on your device.
In this tutorial, you learn how to:
## Prerequisites
-Before you set up a compute role on your Azure Stack Edge Pro device, make sure that:
+Before you set up a compute role on your Azure Stack Edge Pro FPGA device, make sure that:
-- You've activated your Azure Stack Edge Pro device as described in [Connect, set up, and activate Azure Stack Edge Pro](azure-stack-edge-deploy-connect-setup-activate.md).
+- You've activated your Azure Stack Edge Pro FPGA device as described in [Connect, set up, and activate Azure Stack Edge Pro FPGA](azure-stack-edge-deploy-connect-setup-activate.md).
## Configure compute
-To configure compute on your Azure Stack Edge Pro, you'll create an IoT Hub resource.
+To configure compute on your Azure Stack Edge Pro FPGA, you'll create an IoT Hub resource.
1. In the Azure portal of your Azure Stack Edge resource, go to **Overview**. In the right-pane, select the **IoT Edge** tile.
For the advanced deployment in this tutorial, you'll need two shares: one Edge s
## Add a module
-There are no custom modules on this Edge device. You could add a custom or a pre-built module. To learn how to create a custom module, go to [Develop a C# module for your Azure Stack Edge Pro device](azure-stack-edge-create-iot-edge-module.md).
+There are no custom modules on this Edge device. You could add a custom or a pre-built module. To learn how to create a custom module, go to [Develop a C# module for your Azure Stack Edge Pro FPGA device](azure-stack-edge-create-iot-edge-module.md).
-In this section, you add a custom module to the IoT Edge device that you created in [Develop a C# module for your Azure Stack Edge Pro](azure-stack-edge-create-iot-edge-module.md). This custom module takes files from an Edge local share on the Edge device and moves them to an Edge (cloud) share on the device. The cloud share then pushes the files to the Azure storage account that's associated with the cloud share.
+In this section, you add a custom module to the IoT Edge device that you created in [Develop a C# module for your Azure Stack Edge Pro FPGA](azure-stack-edge-create-iot-edge-module.md). This custom module takes files from an Edge local share on the Edge device and moves them to an Edge (cloud) share on the device. The cloud share then pushes the files to the Azure storage account that's associated with the cloud share.
1. Go to your Azure Stack Edge resource and then go to **IoT Edge > Overview**. On the **Modules** tile, select **Go to Azure IoT Hub**.
In this section, you add a custom module to the IoT Edge device that you created
4. Under **Add Modules**, do the following:

   1. Enter the name, address, user name, and password for the container registry settings for the custom module.
- The name, address, and listed credentials are used to retrieve modules with a matching URL. To deploy this module, under **Deployment modules**, select **IoT Edge module**. This IoT Edge module is a docker container that you can deploy to the IoT Edge device that's associated with your Azure Stack Edge Pro device.
+ The name, address, and listed credentials are used to retrieve modules with a matching URL. To deploy this module, under **Deployment modules**, select **IoT Edge module**. This IoT Edge module is a docker container that you can deploy to the IoT Edge device that's associated with your Azure Stack Edge Pro FPGA device.
![The Set Modules page](./media/azure-stack-edge-deploy-configure-compute-advanced/add-module-4.png)
|Field |Value |
|---|---|
- |Name | A unique name for the module. This module is a docker container that you can deploy to the IoT Edge device associated with your Azure Stack Edge Pro. |
+ |Name | A unique name for the module. This module is a docker container that you can deploy to the IoT Edge device associated with your Azure Stack Edge Pro FPGA. |
|Image URI | The image URI for the corresponding container image for the module. |
|Credentials required | If checked, username and password are used to retrieve modules with a matching URL. |
In this tutorial, you learned how to:
> * Add a compute module
> * Verify data transform and transfer
-To learn how to administer your Azure Stack Edge Pro device, see:
+To learn how to administer your Azure Stack Edge Pro FPGA device, see:
> [!div class="nextstepaction"]
-> [Use local web UI to administer a Azure Stack Edge Pro](azure-stack-edge-manage-access-power-connectivity-mode.md)
+> [Use local web UI to administer an Azure Stack Edge Pro FPGA](azure-stack-edge-manage-access-power-connectivity-mode.md)
databox-online Azure Stack Edge Deploy Configure Compute https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-deploy-configure-compute.md
Title: Tutorial to filter, analyze data with compute on Azure Stack Edge Pro | Microsoft Docs
-description: Learn how to configure compute role on Azure Stack Edge Pro and use it to transform data before sending to Azure.
+ Title: Tutorial to filter, analyze data with compute on Azure Stack Edge Pro FPGA | Microsoft Docs
+description: Learn how to configure compute role on Azure Stack Edge Pro FPGA and use it to transform data before sending to Azure.
Last updated 01/06/2021
-# Customer intent: As an IT admin, I need to understand how to configure compute on Azure Stack Edge Pro so I can use it to transform the data before sending it to Azure.
+# Customer intent: As an IT admin, I need to understand how to configure compute on Azure Stack Edge Pro FPGA so I can use it to transform the data before sending it to Azure.
-# Tutorial: Transform the data with Azure Stack Edge Pro
+# Tutorial: Transform the data with Azure Stack Edge Pro FPGA
-This tutorial describes how to configure a compute role on your Azure Stack Edge Pro device. After you configure the compute role, Azure Stack Edge Pro can transform data before sending it to Azure.
+This tutorial describes how to configure a compute role on your Azure Stack Edge Pro FPGA device. After you configure the compute role, Azure Stack Edge Pro FPGA can transform data before sending it to Azure.
This procedure can take around 10 to 15 minutes to complete.
In this tutorial, you learn how to:
## Prerequisites
-Before you set up a compute role on your Azure Stack Edge Pro device, make sure that:
+Before you set up a compute role on your Azure Stack Edge Pro FPGA device, make sure that:
-- You've activated your Azure Stack Edge Pro device as described in [Connect, set up, and activate Azure Stack Edge Pro](azure-stack-edge-deploy-connect-setup-activate.md).
+- You've activated your Azure Stack Edge Pro FPGA device as described in [Connect, set up, and activate Azure Stack Edge Pro FPGA](azure-stack-edge-deploy-connect-setup-activate.md).
## Configure compute
-To configure compute on your Azure Stack Edge Pro, you'll create an IoT Hub resource.
+To configure compute on your Azure Stack Edge Pro FPGA, you'll create an IoT Hub resource.
1. In the Azure portal of your Azure Stack Edge resource, go to **Overview**. In the right-pane, select **IoT Edge**.
For the simple deployment in this tutorial, you'll need two shares: one Edge sha
## Add a module
-You could add a custom or a pre-built module. There are no custom modules on this Edge device. To learn how to create a custom module, go to [Develop a C# module for your Azure Stack Edge Pro device](azure-stack-edge-create-iot-edge-module.md).
+You could add a custom or a pre-built module. There are no custom modules on this Edge device. To learn how to create a custom module, go to [Develop a C# module for your Azure Stack Edge Pro FPGA device](azure-stack-edge-create-iot-edge-module.md).
-In this section, you add a custom module to the IoT Edge device that you created in [Develop a C# module for your Azure Stack Edge Pro](azure-stack-edge-create-iot-edge-module.md). This custom module takes files from an Edge local share on the Edge device and moves them to an Edge (cloud) share on the device. The cloud share then pushes the files to the Azure storage account that's associated with the cloud share.
+In this section, you add a custom module to the IoT Edge device that you created in [Develop a C# module for your Azure Stack Edge Pro FPGA](azure-stack-edge-create-iot-edge-module.md). This custom module takes files from an Edge local share on the Edge device and moves them to an Edge (cloud) share on the device. The cloud share then pushes the files to the Azure storage account that's associated with the cloud share.
1. Go to **IoT Edge > Modules**. From the device command bar, select **+ Add module**.
2. In the **Configure and add module** blade, input the following values:
In this section, you add a custom module to the IoT Edge device that you created
|Field |Value |
|---|---|
- |Name | A unique name for the module. This module is a docker container that you can deploy to the IoT Edge device that's associated with your Azure Stack Edge Pro. |
+ |Name | A unique name for the module. This module is a docker container that you can deploy to the IoT Edge device that's associated with your Azure Stack Edge Pro FPGA. |
|Image URI | The image URI for the corresponding container image for the module. |
|Credentials required | If checked, username and password are used to retrieve modules with a matching URL. |
|Input share | Select an input share. The Edge local share is the input share in this case. The module used here moves files from the Edge local share to an Edge share where they are uploaded into the cloud. |
In this tutorial, you learned how to:
> * Add a compute module
> * Verify data transform and transfer
-To learn how to administer your Azure Stack Edge Pro device, see:
+To learn how to administer your Azure Stack Edge Pro FPGA device, see:
> [!div class="nextstepaction"]
-> [Use local web UI to administer a Azure Stack Edge Pro](azure-stack-edge-manage-access-power-connectivity-mode.md)
+> [Use local web UI to administer an Azure Stack Edge Pro FPGA](azure-stack-edge-manage-access-power-connectivity-mode.md)
databox-online Azure Stack Edge Deploy Connect Setup Activate https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-deploy-connect-setup-activate.md
Title: Tutorial to connect to, configure, activate Azure Stack Edge Pro device in Azure portal | Microsoft Docs
-description: Tutorial to deploy Azure Stack Edge Pro instructs you to connect, set up, and activate your physical device.
+ Title: Tutorial to connect to, configure, activate Azure Stack Edge Pro FPGA device in Azure portal | Microsoft Docs
+description: Tutorial to deploy Azure Stack Edge Pro FPGA instructs you to connect, set up, and activate your physical device.
Last updated 03/28/2019
-# Customer intent: As an IT admin, I need to understand how to connect and activate Azure Stack Edge Pro so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to connect and activate Azure Stack Edge Pro FPGA so I can use it to transfer data to Azure.
-# Tutorial: Connect, set up, and activate Azure Stack Edge Pro
+# Tutorial: Connect, set up, and activate Azure Stack Edge Pro FPGA
-This tutorial describes how you can connect to, set up, and activate your Azure Stack Edge Pro device by using the local web UI.
+This tutorial describes how you can connect to, set up, and activate your Azure Stack Edge Pro FPGA device by using the local web UI.
The setup and activation process can take around 20 minutes to complete.
In this tutorial, you learn how to:
## Prerequisites
-Before you configure and set up your Azure Stack Edge Pro device, make sure that:
+Before you configure and set up your Azure Stack Edge Pro FPGA device, make sure that:
-* You've installed the physical device as detailed in [Install Azure Stack Edge Pro](azure-stack-edge-deploy-install.md).
-* You have the activation key from the Azure Stack Edge service that you created to manage the Azure Stack Edge Pro device. For more information, go to [Prepare to deploy Azure Stack Edge Pro](azure-stack-edge-deploy-prep.md).
+* You've installed the physical device as detailed in [Install Azure Stack Edge Pro FPGA](azure-stack-edge-deploy-install.md).
+* You have the activation key from the Azure Stack Edge service that you created to manage the Azure Stack Edge Pro FPGA device. For more information, go to [Prepare to deploy Azure Stack Edge Pro FPGA](azure-stack-edge-deploy-prep.md).
## Connect to the local web UI setup
-1. Configure the Ethernet adapter on your computer to connect to the Azure Stack Edge Pro device with a static IP address of 192.168.100.5 and subnet 255.255.255.0.
+1. Configure the Ethernet adapter on your computer to connect to the Azure Stack Edge Pro FPGA device with a static IP address of 192.168.100.5 and subnet mask 255.255.255.0.
2. Connect the computer to PORT 1 on your device. Use the following illustration to identify PORT 1 on your device.
Before you configure and set up your Azure Stack Edge Pro device, make sure that
5. Sign in to the web UI of your device. The default password is *Password1*.
- ![Azure Stack Edge Pro device sign-in page](./media/azure-stack-edge-deploy-connect-setup-activate/image3.png)
+ ![Azure Stack Edge Pro FPGA device sign-in page](./media/azure-stack-edge-deploy-connect-setup-activate/image3.png)
6. At the prompt, change the device administrator password. The new password must contain between 8 and 16 characters. It must contain at least three of the following character types: uppercase, lowercase, numeric, and special characters.
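The stated password rules (8 to 16 characters, at least three of the four character types) can be sketched as a quick local pre-check. This helper is hypothetical and is not part of the device UI:

```shell
# Hypothetical pre-check of a candidate password against the stated rules:
# 8-16 characters, containing at least three of: uppercase, lowercase,
# numeric, and special characters.
check_password() {
  p="$1"
  len=${#p}
  [ "$len" -ge 8 ] && [ "$len" -le 16 ] || return 1
  classes=0
  case "$p" in *[A-Z]*) classes=$((classes + 1)) ;; esac
  case "$p" in *[a-z]*) classes=$((classes + 1)) ;; esac
  case "$p" in *[0-9]*) classes=$((classes + 1)) ;; esac
  case "$p" in *[!a-zA-Z0-9]*) classes=$((classes + 1)) ;; esac
  [ "$classes" -ge 3 ]
}

check_password 'Examp1e#pwd' && echo "meets the stated rules"
```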
Your dashboard displays the various settings that are required to configure and
a. In the **Web proxy URL** box, enter the URL in this format: `http://host-IP address or FQDN:Port number`. HTTPS URLs are not supported.
- b. Under **Authentication**, select **None** or **NTLM**. If you enable compute and use IoT Edge module on your Azure Stack Edge Pro device, we recommend you set web proxy authentication to **None**. **NTLM** is not supported.
+ b. Under **Authentication**, select **None** or **NTLM**. If you enable compute and use an IoT Edge module on your Azure Stack Edge Pro FPGA device, we recommend that you set web proxy authentication to **None**. **NTLM** is not supported.
c. If you're using authentication, enter a username and password.
> [!NOTE]
> Proxy-auto config (PAC) files are not supported. A PAC file defines how web browsers and other user agents can automatically choose the appropriate proxy server (access method) for fetching a given URL.
> Proxies that try to intercept and read all the traffic (then re-sign everything with their own certificate) aren't compatible because the proxy's certificate is not trusted.
- > Typically transparent proxies work well with Azure Stack Edge Pro.
+ > Typically transparent proxies work well with Azure Stack Edge Pro FPGA.
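For the web proxy URL format described in step 3a (`http://host:port`; HTTPS URLs are not supported), a sketch of composing the value, with an illustrative host and port:

```shell
# Compose the web proxy URL in the required http://<host or FQDN>:<port>
# form. Host and port are illustrative, not real endpoints.
proxy_host="proxy.contoso.com"
proxy_port="3128"
web_proxy_url="http://${proxy_host}:${proxy_port}"
echo "$web_proxy_url"
```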
4. (Optional) In the left pane, select **Time settings**, and then configure the time zone and the primary and secondary NTP servers for your device. NTP servers are required because your device must synchronize time so that it can authenticate with your cloud service providers.
6. In the left pane, select **Cloud settings**, and then activate your device with the Azure Stack Edge service in the Azure portal.
- 1. In the **Activation key** box, enter the activation key that you got in [Get the activation key](azure-stack-edge-deploy-prep.md#get-the-activation-key) for Azure Stack Edge Pro.
+ 1. In the **Activation key** box, enter the activation key that you got in [Get the activation key](azure-stack-edge-deploy-prep.md#get-the-activation-key) for Azure Stack Edge Pro FPGA.
2. Select **Apply**.

   ![Local web UI "Cloud settings" page](./media/azure-stack-edge-deploy-connect-setup-activate/set-up-activate-6.png)
In this tutorial, you learned how to:
> * Connect to a physical device
> * Set up and activate the physical device
-To learn how to transfer data with your Azure Stack Edge Pro device, see:
+To learn how to transfer data with your Azure Stack Edge Pro FPGA device, see:
> [!div class="nextstepaction"]
-> [Transfer data with Azure Stack Edge Pro](./azure-stack-edge-deploy-add-shares.md).
+> [Transfer data with Azure Stack Edge Pro FPGA](./azure-stack-edge-deploy-add-shares.md).
databox-online Azure Stack Edge Deploy Install https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-deploy-install.md
Title: Tutorial to install - Unpack, rack, cable Azure Stack Edge Pro physical device | Microsoft Docs
-description: The second tutorial about installing Azure Stack Edge Pro involves how to unpack, rack, and cable the physical device.
+ Title: Tutorial to install - Unpack, rack, cable Azure Stack Edge Pro FPGA physical device | Microsoft Docs
+description: The second tutorial about installing Azure Stack Edge Pro FPGA involves how to unpack, rack, and cable the physical device.
Last updated 01/17/2020
-# Customer intent: As an IT admin, I need to understand how to install Azure Stack Edge Pro in datacenter so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to install Azure Stack Edge Pro FPGA in datacenter so I can use it to transfer data to Azure.
-# Tutorial: Install Azure Stack Edge Pro
+# Tutorial: Install Azure Stack Edge Pro FPGA
-This tutorial describes how to install a Azure Stack Edge Pro physical device. The installation procedure involves unpacking, rack mounting, and cabling the device.
+This tutorial describes how to install an Azure Stack Edge Pro FPGA physical device. The installation procedure involves unpacking, rack mounting, and cabling the device.
The installation can take around two hours to complete.
The prerequisites for installing a physical device are as follows:
Before you begin, make sure that:
-* You've completed all the steps in [Prepare to deploy Azure Stack Edge Pro](azure-stack-edge-deploy-prep.md).
+* You've completed all the steps in [Prepare to deploy Azure Stack Edge Pro FPGA](azure-stack-edge-deploy-prep.md).
* You've created an Azure Stack Edge resource to deploy your device.
* You've generated the activation key to activate your device with the Azure Stack Edge resource.
-### For the Azure Stack Edge Pro physical device
+### For the Azure Stack Edge Pro FPGA physical device
Before you deploy a device:
Before you begin:

-- Review the networking requirements for deploying Azure Stack Edge Pro, and configure the datacenter network per the requirements. For more information, see [Azure Stack Edge Pro networking requirements](azure-stack-edge-system-requirements.md#networking-port-requirements).
+- Review the networking requirements for deploying Azure Stack Edge Pro FPGA, and configure the datacenter network per the requirements. For more information, see [Azure Stack Edge Pro FPGA networking requirements](azure-stack-edge-system-requirements.md#networking-port-requirements).
- Make sure that the minimum Internet bandwidth is 20 Mbps for optimal functioning of the device.
This device is shipped in a single box. Complete the following steps to unpack y
1. Place the box on a flat, level surface.
2. Inspect the box and the packaging foam for crushes, cuts, water damage, or any other obvious damage. If the box or packaging is severely damaged, don't open it. Contact Microsoft Support to help you assess whether the device is in good working order.
3. Unpack the box. After unpacking the box, make sure that you have:
- - One single enclosure Azure Stack Edge Pro device
+ - One single enclosure Azure Stack Edge Pro FPGA device
   - Two power cords
   - One rail kit assembly
   - A Safety, Environmental, and Regulatory Information booklet
-If you didn't receive all of the items listed here, contact Azure Stack Edge Pro support. The next step is to rack mount your device.
+If you didn't receive all of the items listed here, contact Azure Stack Edge Pro FPGA support. The next step is to rack mount your device.
## Rack the device
If you didn't receive all of the items listed here, contact Azure Stack Edge Pro
The device must be installed on a standard 19-inch rack. Use the following procedure to rack mount your device.

> [!IMPORTANT]
-> Azure Stack Edge Pro devices must be rack-mounted for proper operation.
+> Azure Stack Edge Pro FPGA devices must be rack-mounted for proper operation.
### Prerequisites
Locate the components for installing the rail kit assembly:
## Cable the device
-Route the cables and then cable your device. The following procedures explain how to cable your Azure Stack Edge Pro device for power and network.
+Route the cables and then cable your device. The following procedures explain how to cable your Azure Stack Edge Pro FPGA device for power and network.
Before you start cabling your device, you need the following:

-- Your Azure Stack Edge Pro physical device, unpacked, and rack mounted.
+- Your Azure Stack Edge Pro FPGA physical device, unpacked, and rack mounted.
- Two power cables.
- At least one 1-GbE RJ-45 network cable to connect to the management interface. There are two 1-GbE network interfaces, one management and one data, on the device.
- One 25-GbE SFP+ copper cable for each data network interface to be configured. At least one data network interface from among PORT 2, PORT 3, PORT 4, PORT 5, or PORT 6 needs to be connected to the Internet (with connectivity to Azure).
Before you start cabling your device, you need the following:
> [!NOTE]
> - If you are connecting only one data network interface, we recommend that you use a 25/10-GbE network interface such as PORT 3, PORT 4, PORT 5, or PORT 6 to send data to Azure.
> - For best performance and to handle large volumes of data, consider connecting all the data ports.
-> - The Azure Stack Edge Pro device should be connected to the datacenter network so that it can ingest data from data source servers.
+> - The Azure Stack Edge Pro FPGA device should be connected to the datacenter network so that it can ingest data from data source servers.
-On your Azure Stack Edge Pro device:
+On your Azure Stack Edge Pro FPGA device:
- The front panel has disk drives and a power button.
Take the following steps to cable your device for power and network.
## Next steps
-In this tutorial, you learned about Azure Stack Edge Pro topics such as how to:
+In this tutorial, you learned about Azure Stack Edge Pro FPGA topics such as how to:
> [!div class="checklist"]
> * Unpack the device
In this tutorial, you learned about Azure Stack Edge Pro topics such as how to:
Advance to the next tutorial to learn how to connect, set up, and activate your device.

> [!div class="nextstepaction"]
-> [Connect and set up Azure Stack Edge Pro](./azure-stack-edge-deploy-connect-setup-activate.md)
+> [Connect and set up Azure Stack Edge Pro FPGA](./azure-stack-edge-deploy-connect-setup-activate.md)
databox-online Azure Stack Edge Deploy Prep https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-deploy-prep.md
Title: Tutorial to prepare Azure portal, datacenter environment to deploy Azure Stack Edge Pro | Microsoft Docs
-description: The first tutorial about deploying Azure Stack Edge Pro involves preparing the Azure portal.
+ Title: Tutorial to prepare Azure portal, datacenter environment to deploy Azure Stack Edge Pro FPGA | Microsoft Docs
+description: The first tutorial about deploying Azure Stack Edge Pro FPGA involves preparing the Azure portal.
Last updated 03/16/2021
-# Customer intent: As an IT admin, I need to understand how to prepare the portal to deploy Azure Stack Edge Pro so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to prepare the portal to deploy Azure Stack Edge Pro FPGA so I can use it to transfer data to Azure.
-# Tutorial: Prepare to deploy Azure Stack Edge Pro
+# Tutorial: Prepare to deploy Azure Stack Edge Pro FPGA
-This is the first tutorial in the series of deployment tutorials that are required to completely deploy Azure Stack Edge Pro. This tutorial describes how to prepare the Azure portal to deploy an Azure Stack Edge resource.
+This is the first tutorial in the series of deployment tutorials that are required to completely deploy Azure Stack Edge Pro FPGA. This tutorial describes how to prepare the Azure portal to deploy an Azure Stack Edge resource.
You need administrator privileges to complete the setup and configuration process. The portal preparation takes less than 10 minutes.
If you don't have an Azure subscription, create a [free account](https://azure.m
## Get started
-To deploy Azure Stack Edge Pro, refer to the following tutorials in the prescribed sequence.
+To deploy Azure Stack Edge Pro FPGA, refer to the following tutorials in the prescribed sequence.
| **#** | **In this step** | **Use these documents** |
| | | |
-| 1. |**[Prepare the Azure portal for Azure Stack Edge Pro](azure-stack-edge-deploy-prep.md)** |Create and configure your Azure Stack Edge resource before you install an Azure Stack Box Edge physical device. |
-| 2. |**[Install Azure Stack Edge Pro](azure-stack-edge-deploy-install.md)**|Unpack, rack, and cable the Azure Stack Edge Pro physical device. |
-| 3. |**[Connect, set up, and activate Azure Stack Edge Pro](azure-stack-edge-deploy-connect-setup-activate.md)** |Connect to the local web UI, complete the device setup, and activate the device. The device is ready to set up SMB or NFS shares. |
-| 4. |**[Transfer data with Azure Stack Edge Pro](azure-stack-edge-deploy-add-shares.md)** |Add shares and connect to shares via SMB or NFS. |
-| 5. |**[Transform data with Azure Stack Edge Pro](azure-stack-edge-deploy-configure-compute.md)** |Configure compute modules on the device to transform the data as it moves to Azure. |
+| 1. |**[Prepare the Azure portal for Azure Stack Edge Pro FPGA](azure-stack-edge-deploy-prep.md)** |Create and configure your Azure Stack Edge resource before you install an Azure Stack Box Edge physical device. |
+| 2. |**[Install Azure Stack Edge Pro FPGA](azure-stack-edge-deploy-install.md)**|Unpack, rack, and cable the Azure Stack Edge Pro FPGA physical device. |
+| 3. |**[Connect, set up, and activate Azure Stack Edge Pro FPGA](azure-stack-edge-deploy-connect-setup-activate.md)** |Connect to the local web UI, complete the device setup, and activate the device. The device is ready to set up SMB or NFS shares. |
+| 4. |**[Transfer data with Azure Stack Edge Pro FPGA](azure-stack-edge-deploy-add-shares.md)** |Add shares and connect to shares via SMB or NFS. |
+| 5. |**[Transform data with Azure Stack Edge Pro FPGA](azure-stack-edge-deploy-configure-compute.md)** |Configure compute modules on the device to transform the data as it moves to Azure. |
You can now begin to set up the Azure portal.

## Prerequisites
-Following are the configuration prerequisites for your Azure Stack Edge resource, your Azure Stack Edge Pro device, and the datacenter network.
+Following are the configuration prerequisites for your Azure Stack Edge resource, your Azure Stack Edge Pro FPGA device, and the datacenter network.
### For the Azure Stack Edge resource
Before you begin, make sure that:
* You have your Microsoft Azure storage account with access credentials.
* You are not blocked by any Azure policy set up by your system administrator. For more information about policies, see [Quickstart: Create a policy assignment to identify non-compliant resources](../governance/policy/assign-policy-portal.md).
-### For the Azure Stack Edge Pro device
+### For the Azure Stack Edge Pro FPGA device
Before you deploy a physical device, make sure that:
Before you deploy a physical device, make sure that:
Before you begin, make sure that:
-* The network in your datacenter is configured per the networking requirements for your Azure Stack Edge Pro device. For more information, see [Azure Stack Edge Pro System Requirements](azure-stack-edge-system-requirements.md).
+* The network in your datacenter is configured per the networking requirements for your Azure Stack Edge Pro FPGA device. For more information, see [Azure Stack Edge Pro FPGA System Requirements](azure-stack-edge-system-requirements.md).
-* For normal operating conditions of your Azure Stack Edge Pro, you have:
+* For normal operating conditions of your Azure Stack Edge Pro FPGA, you have:
* A minimum of 10 Mbps download bandwidth to ensure the device stays updated.
* A minimum of 20 Mbps dedicated upload and download bandwidth to transfer files.

## Create new resource for existing device
-If you're an existing Azure Stack Edge Pro customer, use the following procedure to create a new resource if you need to replace or reset your existing device.
+If you're an existing Azure Stack Edge Pro FPGA customer, use the following procedure to create a new resource if you need to replace or reset your existing device.
If you're a new customer, we recommend that you explore using Azure Stack Edge Pro - GPU devices for your workloads. For more information, go to [What is Azure Stack Edge Pro with GPU](azure-stack-edge-gpu-overview.md). For information about ordering an Azure Stack Edge Pro with GPU device, go to [Create a new resource for Azure Stack Edge Pro - GPU](azure-stack-edge-gpu-deploy-prep.md?tabs=azure-portal#create-a-new-resource).
-To create a new Azure Stack Edge Pro resource for an existing device, take the following steps in the Azure portal.
+To create a new Azure Stack Edge resource for an existing device, take the following steps in the Azure portal.
1. Use your Microsoft Azure credentials to sign in to:
To create a new Azure Stack Edge Pro resource for an existing device, take the following steps in the Azure portal.
1. Select **+ Create a resource**. Search for and select **Azure Stack Edge**. Then select **Create**.
-1. Select the subscription for the Azure Stack Edge Pro device and the country to ship the device to in **Ship to**.
+1. Select the subscription for the Azure Stack Edge Pro FPGA device and the country to ship the device to in **Ship to**.
![Select the subscription and ship-to country for your device](media/azure-stack-edge-deploy-prep/create-fpga-existing-resource-01.png)
To create a new Azure Stack Edge Pro resource for an existing device, take the following steps in the Azure portal.
After the order is placed, Microsoft reviews the order and contacts you (via email) with shipping details.
-![Notification for review of the Azure Stack Edge Pro order](media/azure-stack-edge-deploy-prep/data-box-edge-resource-02.png)
+![Notification for review of the Azure Stack Edge Pro FPGA order](media/azure-stack-edge-deploy-prep/data-box-edge-resource-02.png)
## Get the activation key
-After the Azure Stack Edge resource is up and running, you'll need to get the activation key. This key is used to activate and connect your Azure Stack Edge Pro device with the resource. You can get this key now while you are in the Azure portal.
+After the Azure Stack Edge resource is up and running, you'll need to get the activation key. This key is used to activate and connect your Azure Stack Edge Pro FPGA device with the resource. You can get this key now while you are in the Azure portal.
1. Go to the resource that you created, and select **Overview**. You'll see a notification to the effect that your order is being processed.
After the Azure Stack Edge resource is up and running, you'll need to get the activation key.
## Next steps
-In this tutorial, you learned about Azure Stack Edge Pro topics such as:
+In this tutorial, you learned about Azure Stack Edge Pro FPGA topics such as:
> [!div class="checklist"]
>
> * Create a new resource
> * Get the activation key
-Advance to the next tutorial to learn how to install Azure Stack Edge Pro.
+Advance to the next tutorial to learn how to install Azure Stack Edge Pro FPGA.
> [!div class="nextstepaction"]
-> [Install Azure Stack Edge Pro](./azure-stack-edge-deploy-install.md)
+> [Install Azure Stack Edge Pro FPGA](./azure-stack-edge-deploy-install.md)
databox-online Azure Stack Edge Limits https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-limits.md
Title: Azure Stack Edge Pro limits | Microsoft Docs
-description: Learn about limits and recommended sizes as you deploy and operate Azure Stack Edge Pro, including service limits, device limits, and storage limits.
+ Title: Azure Stack Edge Pro FPGA limits | Microsoft Docs
+description: Learn about limits and recommended sizes as you deploy and operate Azure Stack Edge Pro FPGA, including service limits, device limits, and storage limits.
Last updated 10/12/2020
-# Azure Stack Edge Pro limits
+# Azure Stack Edge Pro FPGA limits
-Consider these limits as you deploy and operate your Microsoft Azure Stack Edge Pro solution.
+Consider these limits as you deploy and operate your Microsoft Azure Stack Edge Pro FPGA solution.
## Azure Stack Edge service limits
Consider these limits as you deploy and operate your Microsoft Azure Stack Edge
## Azure Stack Edge device limits
-The following table describes the limits for the Azure Stack Edge Pro device.
+The following table describes the limits for the Azure Stack Edge Pro FPGA device.
The following table describes the limits for the Azure Stack Edge device.
## Next steps

-- [Prepare to deploy Azure Stack Edge Pro](azure-stack-edge-deploy-prep.md)
+- [Prepare to deploy Azure Stack Edge Pro FPGA](azure-stack-edge-deploy-prep.md)
databox-online Azure Stack Edge Manage Access Power Connectivity Mode https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-manage-access-power-connectivity-mode.md
Title: Azure Stack Edge Pro device access, power, and connectivity mode | Microsoft Docs
-description: Describes how to manage access, power, and connectivity mode for the Azure Stack Edge Pro device that helps transfer data to Azure
+ Title: Azure Stack Edge Pro FPGA device access, power, and connectivity mode | Microsoft Docs
+description: Describes how to manage access, power, and connectivity mode for the Azure Stack Edge Pro FPGA device that helps transfer data to Azure
Last updated 06/24/2019
-# Manage access, power, and connectivity mode for your Azure Stack Edge Pro
+# Manage access, power, and connectivity mode for your Azure Stack Edge Pro FPGA
-This article describes how to manage the access, power, and connectivity mode for your Azure Stack Edge Pro. These operations are performed via the local web UI or the Azure portal.
+This article describes how to manage the access, power, and connectivity mode for your Azure Stack Edge Pro FPGA. These operations are performed via the local web UI or the Azure portal.
In this article, you learn how to:
In this article, you learn how to:
## Manage device access
-The access to your Azure Stack Edge Pro device is controlled by the use of a device password. You can change the password via the local web UI. You can also reset the device password in the Azure portal.
+The access to your Azure Stack Edge Pro FPGA device is controlled by the use of a device password. You can change the password via the local web UI. You can also reset the device password in the Azure portal.
### Change device password
To create your Azure Stack Edge / Data Box Gateway, IoT Hub, and Azure Storage r
### Manage Microsoft Graph API permissions
-When generating the activation key for the Azure Stack Edge Pro device, or performing any operations that require credentials, you need permissions to Azure Active Directory Graph API. The operations that need credentials could be:
+When generating the activation key for the Azure Stack Edge Pro FPGA device, or performing any operations that require credentials, you need permissions to Azure Active Directory Graph API. The operations that need credentials could be:
- Creating a share with an associated storage account.
- Creating a user who can access the shares on the device.
-You should have a `User` access on Active Directory tenant as you need to be able to `Read all directory objects`. You can't be a Guest user as they don't have permissions to `Read all directory objects`. If you're a guest, then the operations such as generation of an activation key, creation of a share on your Azure Stack Edge Pro device, creation of a user, configuration of Edge compute role, reset device password will all fail.
+You should have `User` access on the Active Directory tenant, as you need to be able to `Read all directory objects`. You can't be a Guest user, because guests don't have permission to `Read all directory objects`. If you're a guest, operations such as generating an activation key, creating a share on your Azure Stack Edge Pro FPGA device, creating a user, configuring the Edge compute role, or resetting the device password will all fail.
For more information on how to provide access to users to Microsoft Graph API, see [Microsoft Graph permissions reference](/graph/permissions-reference).
To get a list of registered resource providers in the current subscription, run the following command:

```PowerShell
Get-AzResourceProvider -ListAvailable | where {$_.RegistrationState -eq "Registered"}
```
-For Azure Stack Edge Pro device, `Microsoft.DataBoxEdge` should be registered. To register `Microsoft.DataBoxEdge`, subscription admin should run the following command:
+For the Azure Stack Edge Pro FPGA device, `Microsoft.DataBoxEdge` should be registered. To register `Microsoft.DataBoxEdge`, the subscription admin should run the following command:
```PowerShell
Register-AzResourceProvider -ProviderNamespace Microsoft.DataBoxEdge
```
databox-online Azure Stack Edge Manage Bandwidth Schedules https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-manage-bandwidth-schedules.md
Title: Azure Stack Edge Pro manage bandwidth schedules | Microsoft Docs
-description: Describes how to use the Azure portal to manage bandwidth schedules on your Azure Stack Edge Pro.
+ Title: Azure Stack Edge Pro FPGA manage bandwidth schedules | Microsoft Docs
+description: Describes how to use the Azure portal to manage bandwidth schedules on your Azure Stack Edge Pro FPGA.
Last updated 03/22/2019
-# Use the Azure portal to manage bandwidth schedules on your Azure Stack Edge Pro
+# Use the Azure portal to manage bandwidth schedules on your Azure Stack Edge Pro FPGA
-This article describes how to manage users on your Azure Stack Edge Pro. Bandwidth schedules allow you to configure network bandwidth usage across multiple time-of-day schedules. These schedules can be applied to the upload and download operations from your device to the cloud.
+This article describes how to manage users on your Azure Stack Edge Pro FPGA. Bandwidth schedules allow you to configure network bandwidth usage across multiple time-of-day schedules. These schedules can be applied to the upload and download operations from your device to the cloud.
-You can add, modify, or delete the bandwidth schedules for your Azure Stack Edge Pro via the Azure portal.
+You can add, modify, or delete the bandwidth schedules for your Azure Stack Edge Pro FPGA via the Azure portal.
In this article, you learn how to:
Do the following steps to edit a bandwidth schedule.
## Delete a schedule
-Do the following steps to delete a bandwidth schedule associated with your Azure Stack Edge Pro device.
+Do the following steps to delete a bandwidth schedule associated with your Azure Stack Edge Pro FPGA device.
1. In the Azure portal, go to your Azure Stack Edge resource and then go to **Bandwidth**.
databox-online Azure Stack Edge Manage Compute https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-manage-compute.md
Title: Azure Stack Edge Pro compute management | Microsoft Docs
-description: Describes how to manage the Edge compute settings such as trigger, modules, view compute configuration, remove configuration via the Azure portal on your Azure Stack Edge Pro.
+ Title: Azure Stack Edge Pro FPGA compute management | Microsoft Docs
+description: Describes how to manage the Edge compute settings such as trigger, modules, view compute configuration, remove configuration via the Azure portal on your Azure Stack Edge Pro FPGA.
Last updated 01/06/2021
-# Manage compute on your Azure Stack Edge Pro
+# Manage compute on your Azure Stack Edge Pro FPGA
-This article describes how to manage compute on your Azure Stack Edge Pro. You can manage the compute via the Azure portal or via the local web UI. Use the Azure portal to manage modules, triggers, and compute configuration, and the local web UI to manage compute settings.
+This article describes how to manage compute on your Azure Stack Edge Pro FPGA. You can manage the compute via the Azure portal or via the local web UI. Use the Azure portal to manage modules, triggers, and compute configuration, and the local web UI to manage compute settings.
In this article, you learn how to:
In this article, you learn how to:
## Manage triggers
-Events are things that happen within your cloud environment or on your device that you might want to take action on. For example, when a file is created in a share, it is an event. Triggers raise the events. For your Azure Stack Edge Pro, triggers can be in response to file events or a schedule.
+Events are things that happen within your cloud environment or on your device that you might want to take action on. For example, when a file is created in a share, it is an event. Triggers raise the events. For your Azure Stack Edge Pro FPGA, triggers can be in response to file events or a schedule.
- **File**: These triggers are in response to file events such as creation of a file, modification of a file.
- **Scheduled**: These triggers are in response to a schedule that you can define with a start date, start time, and the repeat interval.
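A scheduled trigger's firing times follow from its start date, start time, and repeat interval. The recurrence can be sketched in a few lines of Python (purely illustrative; the device computes this internally and this is not its API):

```python
from datetime import datetime, timedelta

def next_runs(start: datetime, interval: timedelta, count: int) -> list:
    """First `count` firing times for a schedule defined by a start and a repeat interval."""
    return [start + i * interval for i in range(count)]

# A schedule starting 2021-05-18 09:00 repeating every 12 hours:
runs = next_runs(datetime(2021, 5, 18, 9, 0), timedelta(hours=12), 3)
print([r.isoformat() for r in runs])
# → ['2021-05-18T09:00:00', '2021-05-18T21:00:00', '2021-05-19T09:00:00']
```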
The list of triggers updates to reflect the deletion.
## Manage compute configuration
-Use the Azure portal to view the compute configuration, remove an existing compute configuration, or to refresh the compute configuration to sync up access keys for the IoT device and IoT Edge device for your Azure Stack Edge Pro.
+Use the Azure portal to view the compute configuration, remove an existing compute configuration, or to refresh the compute configuration to sync up access keys for the IoT device and IoT Edge device for your Azure Stack Edge Pro FPGA.
### View compute configuration
Take the following steps in the Azure portal to remove the existing Edge compute configuration.
### Sync up IoT device and IoT Edge device access keys
-When you configure compute on your Azure Stack Edge Pro, an IoT device and an IoT Edge device are created. These devices are automatically assigned symmetric access keys. As a security best practice, these keys are rotated regularly via the IoT Hub service.
+When you configure compute on your Azure Stack Edge Pro FPGA, an IoT device and an IoT Edge device are created. These devices are automatically assigned symmetric access keys. As a security best practice, these keys are rotated regularly via the IoT Hub service.
To rotate these keys, you can go to the IoT Hub service that you created and select the IoT device or the IoT Edge device. Each device has a primary access key and a secondary access key. Assign the primary access key to the secondary access key and then regenerate the primary access key.
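The swap-then-regenerate rotation described above can be sketched generically. This is an illustration of the pattern only, not the IoT Hub SDK or its key format; the `rotate` helper and placeholder key values are ours:

```python
import secrets

def rotate(keys: dict) -> dict:
    """Promote the primary key to secondary, then mint a fresh primary.

    Clients still holding the old primary keep working (it is now the
    secondary), giving them a window to pick up the new primary key.
    """
    return {
        "secondary": keys["primary"],          # old primary stays valid
        "primary": secrets.token_urlsafe(32),  # freshly generated key
    }

keys = {"primary": "key-A", "secondary": "key-B"}
new_keys = rotate(keys)
assert new_keys["secondary"] == "key-A"  # old primary survives as secondary
```

The same two-key overlap is why services such as IoT Hub issue key pairs: rotation never leaves clients with zero valid credentials.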