Updates from: 03/31/2021 03:11:56
Service Microsoft Docs article Related commit history on GitHub Change details
active-directory-b2c Azure Ad External Identities Videos https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/azure-ad-external-identities-videos.md
description: Microsoft Azure Active Directory B2C Video Series
-
Learn the basics of External Identities - Azure Active Directory B2C (Azure AD B2C).
Get a deeper view into the features and technical aspects of the Azure AD B2C service.
-| Video title | | Video title | |
+| Video title | Video | Video title | Video |
|:---|:---|:---|:---|
|[Azure AD B2C sign-up sign-in](https://www.youtube.com/watch?v=c8rN1ZaR7wk&list=PL3ZTgFEc7LyuJ8YRSGXBUVItCPnQz3YX0&index=6&t=2s) 10:25 | [:::image type="icon" source="./media/external-identities-videos/customer-sign-up-sign-in.png" border="false":::](https://www.youtube.com/watch?v=c8rN1ZaR7wk&list=PL3ZTgFEc7LyuJ8YRSGXBUVItCPnQz3YX0&index=6) | [Azure AD B2C single sign on and self service password reset](https://www.youtube.com/watch?v=kRV-7PSLK38&list=PL3ZTgFEc7LyuJ8YRSGXBUVItCPnQz3YX0&index=7) 8:40 | [:::image type="icon" source="./media/external-identities-videos/single-sign-on.png" border="false":::](https://www.youtube.com/watch?v=kRV-7PSLK38&list=PL3ZTgFEc7LyuJ8YRSGXBUVItCPnQz3YX0&index=7) |
| [Application and identity migration to Azure AD B2C](https://www.youtube.com/watch?v=Xw_YwSJmhIQ&list=PL3ZTgFEc7LyuJ8YRSGXBUVItCPnQz3YX0&index=9) 10:34 | [:::image type="icon" source="./media/external-identities-videos/identity-migration-aad-b2c.png" border="false":::](https://www.youtube.com/watch?v=Xw_YwSJmhIQ&list=PL3ZTgFEc7LyuJ8YRSGXBUVItCPnQz3YX0&index=9) | [Build resilient and scalable flows using Azure AD B2C](https://www.youtube.com/watch?v=8f_Ozpw9yTs&list=PL3ZTgFEc7LyuJ8YRSGXBUVItCPnQz3YX0&index=12) 16:47 | [:::image type="icon" source="./media/external-identities-videos/b2c-scalable-flows.png" border="false":::](https://www.youtube.com/watch?v=8f_Ozpw9yTs&list=PL3ZTgFEc7LyuJ8YRSGXBUVItCPnQz3YX0&index=12) |
Learn how to perform various use cases in Azure AD B2C.
-| Video title | | Video title | |
+| Video title | Video | Video title | Video |
|:---|:---|:---|:---|
|[Azure AD: Monitoring and reporting Azure AD B2C using Azure Monitor](https://www.youtube.com/watch?v=Mu9GQy-CbXI&list=PL3ZTgFEc7LyuJ8YRSGXBUVItCPnQz3YX0&index=1) 6:57 | [:::image type="icon" source="./media/external-identities-videos/monitoring-reporting-aad-b2c.png" border="false":::](https://www.youtube.com/watch?v=Mu9GQy-CbXI&list=PL3ZTgFEc7LyuJ8YRSGXBUVItCPnQz3YX0&index=1) | [Azure AD B2C user migration using Microsoft Graph API](https://www.youtube.com/watch?v=9BRXBtkBzL4&list=PL3ZTgFEc7LyuJ8YRSGXBUVItCPnQz3YX0&index=5) 7:09 | [:::image type="icon" source="./media/external-identities-videos/user-migration-msgraph-api.png" border="false":::](https://www.youtube.com/watch?v=9BRXBtkBzL4&list=PL3ZTgFEc7LyuJ8YRSGXBUVItCPnQz3YX0&index=5) |
| [Azure AD B2C user migration strategies](https://www.youtube.com/watch?v=lCWR6PGUgz0&list=PL3ZTgFEc7LyuJ8YRSGXBUVItCPnQz3YX0&index=2) 8:22 | [:::image type="icon" source="./media/external-identities-videos/user-migration-stratagies.png" border="false":::](https://www.youtube.com/watch?v=lCWR6PGUgz0&list=PL3ZTgFEc7LyuJ8YRSGXBUVItCPnQz3YX0&index=2) | [How to localize or customize language using Azure AD B2C](https://www.youtube.com/watch?v=yqrX5_tA7Ms&list=PL3ZTgFEc7LyuJ8YRSGXBUVItCPnQz3YX0&index=13) 20:41 | [:::image type="icon" source="./media/external-identities-videos/language-localization.png" border="false":::](https://www.youtube.com/watch?v=yqrX5_tA7Ms&list=PL3ZTgFEc7LyuJ8YRSGXBUVItCPnQz3YX0&index=13) |
active-directory-domain-services Concepts Replica Sets https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-domain-services/concepts-replica-sets.md
Previously updated : 02/26/2021 Last updated : 03/30/2021
The following example shows a managed domain with three replica sets to further
The default SKU for a managed domain is the *Enterprise* SKU, which supports multiple replica sets. To create additional replica sets if you changed to the *Standard* SKU, [upgrade the managed domain](change-sku.md) to *Enterprise* or *Premium*.
-The supported maximum number of replica sets is four, including the first replica created when you created the managed domain.
+The supported maximum number of replica sets is five, including the first replica created when you created the managed domain.
Billing for each replica set is based on the domain configuration SKU. For example, if you have a managed domain that uses the *Enterprise* SKU and you have three replica sets, your subscription is billed per hour for each of the three replica sets.
No. Replica sets must be in the same subscription as the managed domain.
### How many replica sets can I create?
-You can create a maximum of four replica sets—the initial replica set for the managed domain, plus three additional replica sets.
+You can create a maximum of five replica sets—the initial replica set for the managed domain, plus four additional replica sets.
### How does user and group information get synchronized to my replica sets?
active-directory Plan Auto User Provisioning https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/app-provisioning/plan-auto-user-provisioning.md
Last updated 12/31/2020
-#customer intent: As an admin, I want to automate user provisioning to SaaS apps
+# Customer intent: As an administrator, I want to automate user provisioning to SaaS apps.
# Plan an automatic user provisioning deployment
active-directory Howto Authentication Passwordless Security Key On Premises https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/authentication/howto-authentication-passwordless-security-key-on-premises.md
If clean installing a hybrid Azure AD joined machine, after the domain join and
Make sure enough domain controllers are patched to respond in time to service your resource request. To check if you can see a domain controller that is running the feature, review the output of `nltest /dsgetdc:contoso /keylist /kdc`.
+Note: The `/keylist` switch in the `nltest` command is available in client Windows 10 version 2004 and later.
+ ## Next steps

[Learn more about passwordless](concept-authentication-passwordless.md)
active-directory Howto Authentication Temporary Access Pass https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/authentication/howto-authentication-temporary-access-pass.md
Previously updated : 03/18/2021 Last updated : 03/29/2021
To configure the Temporary Access Pass authentication method policy:
| One-time use | False | True / False | When the policy is set to false, passes in the tenant can be used either once or more than once during its validity (maximum lifetime). By enforcing one-time use in the Temporary Access Pass policy, all passes created in the tenant will be created as one-time use. |
| Length | 8 | 8-48 characters | Defines the length of the passcode. |
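The Length bound in the policy table above can be checked client-side before submitting a policy change; a minimal sketch (the helper name `isValidTapLength` is ours, not part of any SDK):

```javascript
// Hypothetical helper: validate a Temporary Access Pass policy's Length
// setting against the 8-48 character range allowed by the policy.
function isValidTapLength(length) {
  return Number.isInteger(length) && length >= 8 && length <= 48;
}

// The default of 8 is valid; 50 falls outside the allowed range.
console.log(isValidTapLength(8));  // true
console.log(isValidTapLength(50)); // false
```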
-## Create a Temporary Access Pass in the Azure AD Portal
+## Create a Temporary Access Pass
After you enable a policy, you can create a Temporary Access Pass for a user in Azure AD. These roles can perform the following actions related to a Temporary Access Pass.
- Authentication administrators can create, delete, and view a Temporary Access Pass on members (except themselves).
- Global Administrators can view the Temporary Access Pass details on the user (without reading the code itself).
-To create a Temporary Access Pass:
-
-1. Sign in to the portal as either a Global administrator, Privileged Authentication administrator, or Authentication administrator.
+1. Sign in to the Azure portal as either a Global administrator, Privileged Authentication administrator, or Authentication administrator.
1. Click **Azure Active Directory**, browse to **Users**, select a user, such as *Chris Green*, then choose **Authentication methods**.
1. If needed, select the option to **Try the new user authentication methods experience**.
1. Select the option to **Add authentication methods**.
To create a Temporary Access Pass:
![Screenshot of Temporary Access Pass details](./media/how-to-authentication-temporary-access-pass/details.png)
+The following commands show how to create and get a Temporary Access Pass by using PowerShell:
+
+```powershell
+# Create a Temporary Access Pass for a user
+$properties = @{}
+$properties.isUsableOnce = $True
+$properties.startDateTime = '2021-03-11 06:00:00'
+$propertiesJSON = $properties | ConvertTo-Json
+
+New-MgUserAuthenticationTemporaryAccessPassMethod -UserId user2@contoso.com -BodyParameter $propertiesJSON
+
+Id CreatedDateTime IsUsable IsUsableOnce LifetimeInMinutes MethodUsabilityReason StartDateTime TemporaryAccessPass
+-- --------------- -------- ------------ ----------------- --------------------- ------------- -------------------
+c5dbd20a-8b8f-4791-a23f-488fcbde3b38 9/03/2021 11:19:17 PM False True 60 NotYetValid 11/03/2021 6:00:00 AM TAPRocks!
+
+# Get a user's Temporary Access Pass
+Get-MgUserAuthenticationTemporaryAccessPassMethod -UserId user3@contoso.com
+
+Id CreatedDateTime IsUsable IsUsableOnce LifetimeInMinutes MethodUsabilityReason StartDateTime TemporaryAccessPass
+-- --------------- -------- ------------ ----------------- --------------------- ------------- -------------------
+c5dbd20a-8b8f-4791-a23f-488fcbde3b38 9/03/2021 11:19:17 PM False True 60 NotYetValid 11/03/2021 6:00:00 AM
+
+```
+ ## Use a Temporary Access Pass

The most common use for a Temporary Access Pass is for a user to register authentication details during the first sign-in, without the need to complete additional security prompts. Authentication methods are registered at [https://aka.ms/mysecurityinfo](https://aka.ms/mysecurityinfo). Users can also update existing authentication methods here.
An expired Temporary Access Pass can't be used. Under the **Authentication met
1. In the Azure AD portal, browse to **Users**, select a user, such as *Tap User*, then choose **Authentication methods**.
1. On the right-hand side of the **Temporary Access Pass (Preview)** authentication method shown in the list, select **Delete**.
+You can also use PowerShell:
+
+```powershell
+# Remove a user's Temporary Access Pass
+Remove-MgUserAuthenticationTemporaryAccessPassMethod -UserId user3@contoso.com -TemporaryAccessPassAuthenticationMethodId c5dbd20a-8b8f-4791-a23f-488fcbde3b38
+```
+ ## Replace a Temporary Access Pass

- A user can only have one Temporary Access Pass. The passcode can be used during the start and end time of the Temporary Access Pass.
Keep these limitations in mind:
- When using a one-time Temporary Access Pass to register a Passwordless method such as FIDO2 or Phone sign-in, the user must complete the registration within 10 minutes of sign-in with the one-time Temporary Access Pass. This limitation does not apply to a Temporary Access Pass that can be used more than once.
- Guest users can't sign in with a Temporary Access Pass.
- Users in scope for the Self Service Password Reset (SSPR) registration policy will be required to register one of the SSPR methods after they have signed in with a Temporary Access Pass. If the user is only going to use a FIDO2 key, exclude them from the SSPR policy or disable the SSPR registration policy.
-- A Temporary Access Pass cannot be used with the Network Policy Server (NPS) extension and Active Directory Federation Services (AD FS) adapter.
+- A Temporary Access Pass cannot be used with the Network Policy Server (NPS) extension and Active Directory Federation Services (AD FS) adapter, or during Windows Setup/Out-of-Box-Experience (OOBE) and AutoPilot.
- When Seamless SSO is enabled on the tenant, users are prompted to enter a password. The **Use your Temporary Access Pass instead** link will be available for the user to sign in with a Temporary Access Pass.

![Screenshot of Use a Temporary Access Pass instead](./media/how-to-authentication-temporary-access-pass/alternative.png)
active-directory Concept Continuous Access Evaluation https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/conditional-access/concept-continuous-access-evaluation.md
If this scenario exists in your environment to avoid infinite loops, Azure AD wi
For an explanation of the office update channels, see [Overview of update channels for Microsoft 365 Apps](/deployoffice/overview-update-channels). It is recommended that organizations do not disable Web Account Manager (WAM).
-### Policy change timing
+### Group membership and policy update effective time
-Policy changes made by administrators could take up to one day to be effective. Some optimization has been done to reduce the delay to two hours. However, it does not cover all the scenarios yet.
+Group membership and policy updates made by administrators could take up to one day to take effect. Some optimization has been done for policy updates, which reduces the delay to two hours. However, it does not cover all scenarios yet.
-If there is an emergency and you need to have your updated policies to be applied to certain users immediately, you should use this [PowerShell command](/powershell/module/azuread/revoke-azureaduserallrefreshtoken) or "Revoke Session" in the user profile page to revoke the users' session, which will make sure that the updated policies will be applied immediately.
+If there is an emergency and you need updated policies or group membership changes to be applied to certain users immediately, use this [PowerShell command](/powershell/module/azuread/revoke-azureaduserallrefreshtoken) or **Revoke Session** in the user profile page to revoke the users' sessions, which ensures that the updated policies are applied immediately.
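The same revocation can be driven through Microsoft Graph's `revokeSignInSessions` action; a minimal sketch that only builds the request, without sending it (`buildRevokeSessionsRequest` is a hypothetical helper, and the user ID and token are placeholders):

```javascript
// Hypothetical helper: build the Microsoft Graph request that revokes a
// user's sign-in sessions, forcing fresh policy evaluation at next sign-in.
function buildRevokeSessionsRequest(userId, accessToken) {
  return {
    method: "POST",
    url: `https://graph.microsoft.com/v1.0/users/${encodeURIComponent(userId)}/revokeSignInSessions`,
    headers: { Authorization: `Bearer ${accessToken}` },
  };
}

const req = buildRevokeSessionsRequest("user@contoso.com", "<access-token>");
console.log(req.method, req.url);
```

Sending the request with `fetch` (or any HTTP client) requires a token with the appropriate Graph permissions; the sketch deliberately stops at request construction so it can be inspected.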
### Coauthoring in Office apps
active-directory Howto Conditional Access Policy Registration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/conditional-access/howto-conditional-access-policy-registration.md
Previously updated : 03/24/2021 Last updated : 03/29/2021
The following policy applies to the selected users, who attempt to register usin
1. Set **Enable policy** to **On**.
1. Then select **Create**.
-Administrators will now have to issue Temporary Access Pass credentials to new users so they can satisfy the requirements for multi-factor authentication to register. Steps to accomplish this task, are found in the section [Create a Temporary Access Pass in the Azure AD Portal](../authentication/howto-authentication-temporary-access-pass.md#create-a-temporary-access-pass-in-the-azure-ad-portal).
+Administrators will now have to issue Temporary Access Pass credentials to new users so they can satisfy the requirements for multi-factor authentication to register. Steps to accomplish this task, are found in the section [Create a Temporary Access Pass in the Azure AD Portal](../authentication/howto-authentication-temporary-access-pass.md#create-a-temporary-access-pass).
Organizations may choose to require other grant controls in addition to or in place of **Require multi-factor authentication** at step 6b. When selecting multiple controls be sure to select the appropriate radio button toggle to require **all** or **one** of the selected controls when making this change.
active-directory Active Directory Claims Mapping https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/active-directory-claims-mapping.md
Based on the method chosen, a set of inputs and outputs is expected. Define the
| ExtractMailPrefix | None |
| Join | The suffix being joined must be a verified domain of the resource tenant. |
-### Custom signing key
-
-A custom signing key must be assigned to the service principal object for a claims mapping policy to take effect. This ensures acknowledgment that tokens have been modified by the creator of the claims mapping policy and protects applications from claims mapping policies created by malicious actors. In order to add a custom signing key, you can use the Azure PowerShell cmdlet [`New-AzureADApplicationKeyCredential`](/powerShell/module/Azuread/New-AzureADApplicationKeyCredential) to create a certificate key credential for your Application object.
-
-Apps that have claims mapping enabled must validate their token signing keys by appending `appid={client_id}` to their [OpenID Connect metadata requests](v2-protocols-oidc.md#fetch-the-openid-connect-metadata-document). Below is the format of the OpenID Connect metadata document you should use:
-
-```
-https://login.microsoftonline.com/{tenant}/v2.0/.well-known/openid-configuration?appid={client-id}
-```
- ### Cross-tenant scenarios

Claims mapping policies do not apply to guest users. If a guest user tries to access an application with a claims mapping policy assigned to its service principal, the default token is issued (the policy has no effect).
In this example, you create a policy that emits a custom claim "JoinedData" to J
Add-AzureADServicePrincipalPolicy -Id <ObjectId of the ServicePrincipal> -RefObjectId <ObjectId of the Policy>
```
+## Security considerations
+
+Applications that receive tokens rely on the fact that the claim values are authoritatively issued by Azure AD and cannot be tampered with. However, when you modify the token contents via claims mapping policies, these assumptions may no longer be correct. Applications must explicitly acknowledge that tokens have been modified by the creator of the claims mapping policy to protect themselves from claims mapping policies created by malicious actors. This can be done in the following ways:
+
+- Configure a custom signing key
+- Update the application manifest to accept mapped claims
+
+Without this, Azure AD will return an [`AADSTS50146` error code](reference-aadsts-error-codes.md#aadsts-error-codes).
+
+### Custom signing key
+
+In order to add a custom signing key to the service principal object, you can use the Azure PowerShell cmdlet [`New-AzureADApplicationKeyCredential`](/powerShell/module/Azuread/New-AzureADApplicationKeyCredential) to create a certificate key credential for your Application object.
+
+Apps that have claims mapping enabled must validate their token signing keys by appending `appid={client_id}` to their [OpenID Connect metadata requests](v2-protocols-oidc.md#fetch-the-openid-connect-metadata-document). Below is the format of the OpenID Connect metadata document you should use:
+
+```
+https://login.microsoftonline.com/{tenant}/v2.0/.well-known/openid-configuration?appid={client-id}
+```
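The metadata URL format above can also be produced programmatically; a minimal sketch (the function name is ours, and the tenant and client ID values are placeholders):

```javascript
// Build the OpenID Connect metadata URL for an app with claims mapping
// enabled, appending appid={client_id} as described above.
function openIdMetadataUrl(tenant, clientId) {
  return `https://login.microsoftonline.com/${tenant}/v2.0/` +
    `.well-known/openid-configuration?appid=${clientId}`;
}

console.log(openIdMetadataUrl("contoso.onmicrosoft.com", "00001111-aaaa-bbbb-cccc-111122223333"));
```

Apps would fetch this document and use its `jwks_uri` to obtain the app-specific signing keys instead of the tenant-wide keys.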
+
+### Update the application manifest
+
+Alternatively, you can set the `acceptMappedClaims` property to `true` in the [application manifest](reference-app-manifest.md). As documented on the [apiApplication resource type](/graph/api/resources/apiapplication#properties), this allows an application to use claims mapping without specifying a custom signing key.
+
+This does require the requested token audience to use a verified domain name of your Azure AD tenant, which means you should set the **Application ID URI** (represented by `identifierUris` in the application manifest), for example, to `https://contoso.com/my-api` or (simply using the default tenant name) `https://contoso.onmicrosoft.com/my-api`.
+
+If you're not using a verified domain, Azure AD will return an `AADSTS501461` error code with message *"AcceptMappedClaims is only supported for a token audience matching the application GUID or an audience within the tenant's verified domains. Either change the resource identifier, or use an application-specific signing key."*
+ ## See also

- To learn how to customize claims issued in the SAML token through the Azure portal, see [How to: Customize claims issued in the SAML token for enterprise applications](active-directory-saml-claims-customization.md)
active-directory Msal B2c Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/msal-b2c-overview.md
Last updated 06/05/2020
-# Customer intent: As an application developer, I want to learn how MSAL.js can be used with Azure AD B2C for
-# authentication and authorization in my organization's web apps and web APIs that my customers log in to and use.
+# Customer intent: As an application developer, I want to learn how MSAL.js can be used with Azure AD B2C for authentication and authorization in my organization's web apps and web APIs that my customers log in to and use.
# Use the Microsoft Authentication Library for JavaScript to work with Azure AD B2C
active-directory Msal Js Initializing Client Applications https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/msal-js-initializing-client-applications.md
Last updated 07/17/2020
-# Customer intent: As an application developer, I want to learn about initializing a client application in MSAL.js to
-# enable support for authentication and authorization in a JavaScript single-page application (SPA).
+# Customer intent: As an application developer, I want to learn about initializing a client application in MSAL.js to enable support for authentication and authorization in a JavaScript single-page application (SPA).
# Initialize client applications using MSAL.js
active-directory Msal Net Aad B2c Considerations https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/msal-net-aad-b2c-considerations.md
Last updated 05/07/2020
-# Customer intent: As an application developer, I want to learn about specific considerations when using
-# Azure AD B2C and MSAL.NET so I can decide if this platform meets my application development
-# needs and requirements.
+# Customer intent: As an application developer, I want to learn about specific considerations when using Azure AD B2C and MSAL.NET so I can decide if this platform meets my application development needs and requirements.
# Use MSAL.NET to sign in users with social identities
active-directory Msal Net Client Assertions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/msal-net-client-assertions.md
string GetSignedClientAssertion()
var header = new Dictionary<string, string>() { { "alg", "RS256"},
- { "kid", Encode(Certificate.GetCertHash()) }
+ { "kid", Encode(certificate.GetCertHash()) }
}; //Please see the previous code snippet on how to craft claims for the GetClaims() method
- string token = Encode(Encoding.UTF8.GetBytes(JObject.FromObject(header).ToString())) + "." + Encode(Encoding.UTF8.GetBytes(JObject.FromObject(GetClaims())));
+ string token = Encode(Encoding.UTF8.GetBytes(JObject.FromObject(header).ToString())) + "." + Encode(Encoding.UTF8.GetBytes(JObject.FromObject(GetClaims()).ToString()));
string signature = Encode(rsa.SignData(Encoding.UTF8.GetBytes(token), new SHA256Cng())); string signedClientAssertion = string.Concat(token, ".", signature);
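The diff above fixes a subtle bug: the claims `JObject` was never serialized to a JSON string before encoding. The intended header.claims assembly can be sketched as follows (in JavaScript for illustration; the helper names are ours, not from the article):

```javascript
// Sketch of unsigned client-assertion assembly: base64url-encode the JSON
// header and claims, then join with "." (the signature is appended later).
function base64url(s) {
  return Buffer.from(s).toString("base64")
    .replace(/\+/g, "-").replace(/\//g, "_").replace(/=+$/g, "");
}

function unsignedAssertion(header, claims) {
  // Both segments must be serialized JSON before encoding -- the missing
  // ToString() in the original C# skipped this step for the claims.
  return base64url(JSON.stringify(header)) + "." + base64url(JSON.stringify(claims));
}

const token = unsignedAssertion({ alg: "RS256" }, { aud: "https://login.microsoftonline.com" });
console.log(token);
```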
active-directory Quickstart Register App https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/quickstart-register-app.md
Last updated 09/03/2020
-# Customer intent: As an enterprise developer or software-as-a-service (SaaS) provider, I want to
-# know how to register my application with the Microsoft identity platform so that the security
-# token service can issue ID and/or access tokens to clients that want to access it.
+# Customer intent: As an enterprise developer or software-as-a-service (SaaS) provider, I want to know how to register my application with the Microsoft identity platform so that the security token service can issue ID and/or access tokens to clients that want to access it.
# Quickstart: Register an application with the Microsoft identity platform
active-directory Quickstart V2 Angular https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/quickstart-v2-angular.md
In this quickstart, you download and run a code sample that demonstrates how an
> > To find the values of **Application (client) ID**, **Directory (tenant) ID**, and **Supported account types**, go to the app's **Overview** page in the Azure portal.
-For more information about available configurable options, see [Initialize client applications](msal-js-initializing-client-applications.md).
+> For more information about available configurable options, see [Initialize client applications](msal-js-initializing-client-applications.md).
-You can find the source code for the MSAL.js library in the [AzureAD/microsoft-authentication-library-for-js](https://github.com/AzureAD/microsoft-authentication-library-for-js) repository on GitHub.
+> You can find the source code for the MSAL.js library in the [AzureAD/microsoft-authentication-library-for-js](https://github.com/AzureAD/microsoft-authentication-library-for-js) repository on GitHub.
->[!div class="sxs-lookup" renderon="portal"]
->#### Step 3: Run the project
+> [!div class="sxs-lookup" renderon="portal"]
+> #### Step 3: Your app is configured and ready to run
+> We have configured your project with values of your app's properties.
+
+> [!div renderon="docs"]
+>
+> Scroll down in the same file and update the `graphMeEndpoint`.
+> - Replace the string `Enter_the_Graph_Endpoint_Herev1.0/me` with `https://graph.microsoft.com/v1.0/me`
+> - `Enter_the_Graph_Endpoint_Herev1.0/me` is the endpoint that API calls will be made against. For the main (global) Microsoft Graph API service, enter `https://graph.microsoft.com/` (include the trailing forward-slash). For more information, see the [documentation](https://docs.microsoft.com/graph/deployments).
+>
+>
+> ```javascript
+> protectedResourceMap: [
+> ['Enter_the_Graph_Endpoint_Herev1.0/me', ['user.read']]
+> ],
+> ```
+>
+>
> [!div renderon="docs"]
> #### Step 4: Run the project
active-directory Reference V2 Libraries https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/reference-v2-libraries.md
Previously updated : 01/29/2021 Last updated : 03/30/2021
-# Customer intent: As a developer, I want to know whether there's a Microsoft Authentication Library (MSAL) available for
-# the language/framework I'm using to build my application, and whether the library is GA or in preview.
+# Customer intent: As a developer, I want to know whether there's a Microsoft Authentication Library (MSAL) available for the language/framework I'm using to build my application, and whether the library is GA or in preview.
# Microsoft identity platform authentication libraries
If you choose to hand-code your own protocol-level implementation of [OAuth 2.0
## Single-page application (SPA)
-A single-page application runs entirely on the browser surface and fetches page data (HTML, CSS, and JavaScript) dynamically or at application load time. It can call web APIs to interact with back-end data sources.
+A single-page application runs entirely in the browser and fetches page data (HTML, CSS, and JavaScript) dynamically or at application load time. It can call web APIs to interact with back-end data sources.
Because a SPA's code runs entirely in the browser, it's considered a *public client* that's unable to store secrets securely.
Because a web application's code runs on the web server, it's considered a *conf
## Desktop application
-A desktop application is typically binary (compiled) code that surfaces a user interface and is intended to run on a user's desktop.
+A desktop application is typically binary (compiled) code that displays a user interface and is intended to run on a user's desktop.
Because a desktop application runs on the user's desktop, it's considered a *public client* that's unable to store secrets securely.
Because a desktop application runs on the user's desktop, it's considered a *pub
## Mobile application
-A mobile application is typically binary (compiled) code that surfaces a user interface and is intended to run on a user's mobile device.
+A mobile application is typically binary (compiled) code that displays a user interface and is intended to run on a user's mobile device.
Because a mobile application runs on the user's mobile device, it's considered a *public client* that's unable to store secrets securely.
active-directory Scenario Spa Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-spa-overview.md
Learn all you need to build a single-page application (SPA).
If you haven't already, create your first app by completing the JavaScript SPA quickstart:
-[Quickstart: Single-page application](./quickstart-v2-javascript.md)
+[Quickstart: Single-page application](./quickstart-v2-javascript-auth-code.md)
## Overview
active-directory Scenario Spa Sign In https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-spa-sign-in.md
function handleResponse(response) {
} }
-myMsal.handleRedirectPromise(handleResponse);
+myMsal.handleRedirectPromise().then(handleResponse);
myMsal.loginRedirect(loginRequest); ```
active-directory Troubleshoot Hybrid Join Windows Current https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/devices/troubleshoot-hybrid-join-windows-current.md
Use Event Viewer logs to locate the phase and errorcode for the join failures.
- Reason: The connection with the server was terminated abnormally. - Resolution: Retry after sometime or try joining from an alternate stable network location.
+##### Other Errors
+
+- **DSREG_AUTOJOIN_ADCONFIG_READ_FAILED** (0x801c001d/-2145648611)
+ - Reason: EventID 220 is present in User Device Registration event logs. Windows cannot access the computer object in Active Directory. A Windows error code may be included in the event. Error codes ERROR_NO_SUCH_LOGON_SESSION (1312) and ERROR_NO_SUCH_USER (1317) are related to replication issues in on-premises AD.
+ - Resolution: Troubleshoot replication issues in AD. Replication issues may be transient and may go away after a period of time.
+ ##### Federated join server Errors

| Server error code | Server error message | Possible reasons | Resolution |
active-directory Directory Overview User Model https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/enterprise-users/directory-overview-user-model.md
-#As a new Azure AD identity administrator, user management is at the core of my work so I need to understand the user management tools such as groups, administrator roles, and licenses to manage users.
+#Customer intent: As a new Azure AD identity administrator, user management is at the core of my work so I need to understand the user management tools such as groups, administrator roles, and licenses to manage users.
# What is enterprise user management?
active-directory Groups Dynamic Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/enterprise-users/groups-dynamic-tutorial.md
Last updated 12/02/2020
-#As a new Azure AD identity administrator, I want to automatically add or remove users, so I don't have to manually do it."
+#Customer intent: As a new Azure AD identity administrator, I want to automatically add or remove users, so I don't have to do it manually.
active-directory Groups Quickstart Expiration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/enterprise-users/groups-quickstart-expiration.md
Last updated 12/02/2020
-#As a new Azure AD identity administrator, I want user-created Microsoft 365 groups in my organization to expire so I can reduce the number of unused groups.
+#Customer intent: As a new Azure AD identity administrator, I want user-created Microsoft 365 groups in my organization to expire so I can reduce the number of unused groups.
# Quickstart: Set Microsoft 365 groups to expire in Azure Active Directory
active-directory Groups Quickstart Naming Policy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/enterprise-users/groups-quickstart-naming-policy.md
Last updated 12/02/2020
-#As an Azure AD identity administrator, I want to enforce naming policy on self-service groups, to help me sort and search in my Azure AD organization's user-created groups.
+#Customer intent: As an Azure AD identity administrator, I want to enforce naming policy on self-service groups, to help me sort and search in my Azure AD organization's user-created groups.
active-directory Users Revoke Access https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/enterprise-users/users-revoke-access.md
Previously updated : 12/02/2020 Last updated : 03/29/2021
# Revoke user access in Azure Active Directory
-Among the scenarios that could require an administrator to revoke all access for a user include compromised accounts, employee termination, and other insider threats. Depending on the complexity of the environment, administrators can take several steps to ensure access is revoked. In some scenarios, there could be a period between initiation of access revocation and when access is effectively revoked.
+Scenarios that could require an administrator to revoke all access for a user include compromised accounts, employee termination, and other insider threats. Depending on the complexity of the environment, administrators can take several steps to ensure access is revoked. In some scenarios, there could be a period between the initiation of access revocation and when access is effectively revoked.
To mitigate the risks, you must understand how tokens work. There are many kinds of tokens, which fall into one of the patterns mentioned in the sections below.
Access tokens and refresh tokens are frequently used with thick client applications.
- Access tokens issued by Azure AD by default last for 1 hour. If the authentication protocol allows, the app can silently reauthenticate the user by passing the refresh token to the Azure AD when the access token expires.
-Azure AD then reevaluates its authorization policies. If the user is still authorized, Azure AD issues a new access token and refresh token.
+Azure AD then reevaluates its authorization policies. If the user is still authorized, Azure AD issues a new access token and refresh token.
-Access tokens can be a security concern if access must be revoked within a time that is shorter than the lifetime of the token, which is usually around an hour. For this reason, Microsoft is actively working to bring [continuous access evaluation](../conditional-access/concept-continuous-access-evaluation.md) to Microsoft 365 applications, which helps ensure invalidation of access tokens in near real time.
+Access tokens can be a security concern if access must be revoked within a time that is shorter than the lifetime of the token, which is usually around an hour. For this reason, Microsoft is actively working to bring [continuous access evaluation](https://docs.microsoft.com/azure/active-directory/fundamentals/concept-fundamentals-continuous-access-evaluation) to Office 365 applications, which helps ensure invalidation of access tokens in near real time.
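To see this lifetime concretely, the `exp` claim inside an Azure AD access token (a JWT) records its expiry as a Unix timestamp. A minimal PowerShell sketch, assuming `$accessToken` holds a token string you have already acquired:

```PowerShell
# Sketch only: decode the JWT payload of an already-acquired token in $accessToken.
$payload = $accessToken.Split('.')[1].Replace('-', '+').Replace('_', '/')
# Pad the Base64URL payload so .NET can decode it.
switch ($payload.Length % 4) { 2 { $payload += '==' } 3 { $payload += '=' } }
$claims = [System.Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($payload)) | ConvertFrom-Json
# 'exp' is seconds since the Unix epoch; by default it lands about an hour after issuance.
[DateTimeOffset]::FromUnixTimeSeconds($claims.exp).UtcDateTime
```

This only inspects the token locally; revoking refresh tokens and sessions still happens server-side.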
## Session tokens (cookies)
For a hybrid environment with on-premises Active Directory synchronized with Azure AD
As an admin in the Active Directory, connect to your on-premises network, open PowerShell, and take the following actions:
-1. Disable the user in Active Directory. Refer to [Disable-ADAccount](/powershell/module/addsadministration/disable-adaccount).
+1. Disable the user in Active Directory. Refer to [Disable-ADAccount](https://docs.microsoft.com/powershell/module/addsadministration/disable-adaccount?view=win10-ps).
```PowerShell
Disable-ADAccount -Identity johndoe
```
-1. Reset the user's password twice in the Active Directory. Refer to [Set-ADAccountPassword](/powershell/module/addsadministration/set-adaccountpassword).
+2. Reset the user's password twice in the Active Directory. Refer to [Set-ADAccountPassword](https://docs.microsoft.com/powershell/module/addsadministration/set-adaccountpassword?view=win10-ps).
> [!NOTE]
> The reason for changing a user's password twice is to mitigate the risk of pass-the-hash, especially if there are delays in on-premises password replication. If you can safely assume this account isn't compromised, you may reset the password only once.
- > [!IMPORTANT]
+ > [!IMPORTANT]
> Don't use the example passwords in the following cmdlets. Be sure to change the passwords to a random string.

```PowerShell
As an admin in the Active Directory, connect to your on-premises network, open PowerShell, and take the following actions:
As an administrator in Azure Active Directory, open PowerShell, run ``Connect-AzureAD``, and take the following actions:
-1. Disable the user in Azure AD. Refer to [Set-AzureADUser](/powershell/module/azuread/Set-AzureADUser).
+1. Disable the user in Azure AD. Refer to [Set-AzureADUser](https://docs.microsoft.com/powershell/module/azuread/Set-AzureADUser?view=azureadps-2.0).
```PowerShell
Set-AzureADUser -ObjectId johndoe@contoso.com -AccountEnabled $false
```
-1. Revoke the user's Azure AD refresh tokens. Refer to [Revoke-AzureADUserAllRefreshToken](/powershell/module/azuread/revoke-azureaduserallrefreshtoken).
+
+2. Revoke the user's Azure AD refresh tokens. Refer to [Revoke-AzureADUserAllRefreshToken](https://docs.microsoft.com/powershell/module/azuread/revoke-azureaduserallrefreshtoken?view=azureadps-2.0).
```PowerShell
Revoke-AzureADUserAllRefreshToken -ObjectId johndoe@contoso.com
```
-1. Disable the user's devices. Refer to [Get-AzureADUserRegisteredDevice](/powershell/module/azuread/get-azureaduserregistereddevice).
+3. Disable the user's devices. Refer to [Get-AzureADUserRegisteredDevice](https://docs.microsoft.com/powershell/module/azuread/get-azureaduserregistereddevice?view=azureadps-2.0).
```PowerShell
Get-AzureADUserRegisteredDevice -ObjectId johndoe@contoso.com | Set-AzureADDevice -AccountEnabled $false
```
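Taken together, the Azure AD steps above can be scripted as one sequence. This is a sketch that uses only the cmdlets already referenced; it assumes the AzureAD module is installed, `Connect-AzureAD` has already completed, and `johndoe@contoso.com` is the placeholder account from the examples.

```PowerShell
$upn = 'johndoe@contoso.com'  # placeholder user from the examples above

# 1. Block new sign-ins by disabling the account.
Set-AzureADUser -ObjectId $upn -AccountEnabled $false

# 2. Invalidate the user's existing refresh tokens.
Revoke-AzureADUserAllRefreshToken -ObjectId $upn

# 3. Disable every device registered to the user.
Get-AzureADUserRegisteredDevice -ObjectId $upn |
    Set-AzureADDevice -AccountEnabled $false
```

Running the three steps back to back keeps the window between disabling the account and revoking tokens as short as possible.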
+## When access is revoked
-## Optional steps
+Once admins have taken the above steps, the user can't gain new tokens for any application tied to Azure Active Directory. The elapsed time between revocation and the user losing their access depends on how the application is granting access:
-- [Wipe corporate data from Intune-managed applications](/mem/intune/apps/apps-selective-wipe).
+- For **applications using access tokens**, the user loses access when the access token expires.
-- [Wipe corporate owned devices be resetting device to factory default settings](/mem/intune/remote-actions/devices-wipe).
+- For **applications that use session tokens**, the existing sessions end as soon as the token expires. If the disabled state of the user is synchronized to the application, the application can automatically revoke the user's existing sessions if it's configured to do so. The time it takes depends on the frequency of synchronization between the application and Azure AD.
-> [!NOTE]
-> Data on the device cannot be recovered after a wipe.
+## Best practices
-## When access is revoked
+- Deploy an automated provisioning and deprovisioning solution. Deprovisioning users from applications is an effective way of revoking access, especially for applications that use session tokens. Develop a process to deprovision users from apps that don't support automatic provisioning and deprovisioning. Ensure applications revoke their own session tokens and stop accepting Azure AD access tokens even if they're still valid.
-Once admins have taken the above steps, the user can't gain new tokens for any application tied to Azure Active Directory. The elapsed time between revocation and the user losing their access depends on how the application is granting access:
+ - Use [Azure AD SaaS App Provisioning](https://docs.microsoft.com/azure/active-directory/app-provisioning/user-provisioning). Azure AD SaaS App Provisioning typically runs automatically every 20-40 minutes. [Configure Azure AD provisioning](https://docs.microsoft.com/azure/active-directory/saas-apps/tutorial-list) to deprovision or deactivate disabled users in applications.
+
+ - For applications that don't use Azure AD SaaS App Provisioning, use [Microsoft Identity Manager (MIM)](https://docs.microsoft.com/microsoft-identity-manager/mim-how-provision-users-adds) or a third-party solution to automate the deprovisioning of users.
+ - Identify and develop a process for applications that require manual deprovisioning. Ensure admins can quickly run the required manual tasks to deprovision the user from these apps when needed.
+
+- [Manage your devices and applications with Microsoft Intune](https://docs.microsoft.com/mem/intune/remote-actions/device-management). Intune-managed [devices can be reset to factory settings](https://docs.microsoft.com/mem/intune/remote-actions/devices-wipe). If the device is unmanaged, you can [wipe the corporate data from managed apps](https://docs.microsoft.com/mem/intune/apps/apps-selective-wipe). These processes are effective for removing potentially sensitive data from end users' devices. However, for either process to be triggered, the device must be connected to the internet. If the device is offline, the device will still have access to any locally stored data.
-- For **applications using access tokens**, the user loses access when the access token expires.
+> [!NOTE]
+> Data on the device cannot be recovered after a wipe.
-- For **applications that use session tokens**, the existing sessions end as soon as the token expires. If the disabled state of the user is synchronized to the application, the application can automatically revoke the userΓÇÖs existing sessions if it's configured to do so. The time it takes depends on the frequency of synchronization between the application and Azure AD.
+- Use [Microsoft Cloud App Security (MCAS) to block data download](https://docs.microsoft.com/cloud-app-security/use-case-proxy-block-session-aad) when appropriate. If the data can only be accessed online, organizations can monitor sessions and achieve real-time policy enforcement.
+
+- Enable [Continuous Access Evaluation (CAE) in Azure AD](https://docs.microsoft.com/azure/active-directory/conditional-access/concept-continuous-access-evaluation). CAE allows admins to revoke the session tokens and access tokens for applications that are CAE capable.
## Next steps
-- [Secure access practices for Azure AD administrators](../roles/security-planning.md)
+- [Secure access practices for Azure AD administrators](https://docs.microsoft.com/azure/active-directory/roles/security-planning)
- [Add or update user profile information](../fundamentals/active-directory-users-profile-azure-portal.md)
active-directory B2b Quickstart Invite Powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/external-identities/b2b-quickstart-invite-powershell.md
-#customer intent: As a tenant admin, I want to walk through the B2B invitation workflow so that I can understand how to add a user through PowerShell.
+# Customer intent: As a tenant admin, I want to walk through the B2B invitation workflow so that I can understand how to add a user through PowerShell.
active-directory Bulk Invite Powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/external-identities/bulk-invite-powershell.md
-#customer intent: As a tenant administrator, I want to send B2B invitations to multiple external users at the same time so that I can avoid having to send individual invitations to each user.
+# Customer intent: As a tenant administrator, I want to send B2B invitations to multiple external users at the same time so that I can avoid having to send individual invitations to each user.
active-directory Tutorial Bulk Invite https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/external-identities/tutorial-bulk-invite.md
-#customer intent: As a tenant administrator, I want to send B2B invitations to multiple external users at the same time so that I can avoid having to send individual invitations to each user.
+# Customer intent: As a tenant administrator, I want to send B2B invitations to multiple external users at the same time so that I can avoid having to send individual invitations to each user.
active-directory Active Directory Whatis https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/fundamentals/active-directory-whatis.md
Last updated 06/05/2020
-#customer intent: As a new administrator, I want to understand what Azure Active Directory is, which license is right for me, and what features are available.
+# Customer intent: As a new administrator, I want to understand what Azure Active Directory is, which license is right for me, and what features are available.
active-directory Frontline Worker Management https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/fundamentals/frontline-worker-management.md
+
+ Title: Frontline worker management - Azure Active Directory
+description: Learn about frontline worker management capabilities that are provided through the My Staff portal.
+Last updated : 03/16/2021
+#Customer intent: As a manager of frontline workers, I want an intuitive portal so that I can easily onboard new workers and provision shared devices.
+
+# Frontline worker management
+
+Frontline workers account for over 80 percent of the global workforce. Yet because of high scale, rapid turnover, and fragmented processes, frontline workers often lack the tools to make their demanding jobs a little easier. Frontline worker management brings digital transformation to the entire frontline workforce. The workforce may include managers, frontline workers, operations, and IT.
+
+Frontline worker management empowers the frontline workforce by making the following activities easier to accomplish:
+- Streamlining common IT tasks with My Staff
+- Easy onboarding of frontline workers through simplified authentication
+- Seamless provisioning of shared devices and secure sign-out of frontline workers
+
+## Delegated user management through My Staff
+
+Azure Active Directory (Azure AD) provides the ability to delegate user management to frontline managers through the [My Staff portal](../roles/my-staff-configure.md), helping save valuable time and reduce risks. By enabling simplified password resets and phone management directly from the store or factory floor, managers can grant access to employees without routing the request through the help-desk, IT, or operations.
+
+![Delegated user management in the My Staff portal](media/concept-fundamentals-frontline-worker/delegated-user-management.png)
+
+## Accelerated onboarding with simplified authentication
+
+My Staff also enables frontline managers to register their team members' phone numbers for [SMS sign-in](../authentication/howto-authentication-sms-signin.md). In many verticals, frontline workers maintain a local username and password combination, a solution that is often cumbersome, expensive, and error-prone. When IT enables authentication using SMS sign-in, frontline workers can log in with [single sign-on (SSO)](../manage-apps/what-is-single-sign-on.md) for Microsoft Teams and other apps using just their phone number and a one-time passcode (OTP) sent via SMS. This makes signing in for frontline workers simple and secure, delivering quick access to the apps they need most.
+
+![SMS sign-in](media/concept-fundamentals-frontline-worker/sms-signin.png)
+
+Frontline managers can also use Managed Home Screen (MHS) application to allow workers to have access to a specific set of applications on their Intune-enrolled Android dedicated devices. The dedicated devices are enrolled with [Azure AD shared device mode](../develop/msal-shared-devices.md). When configured in multi-app kiosk mode in the Microsoft Endpoint Manager (MEM) console, MHS is automatically launched as the default home screen on the device and appears to the end user as the *only* home screen. To learn more, see how to [configure the Microsoft Managed Home Screen app for Android Enterprise](/mem/intune/apps/app-configuration-managed-home-screen-app).
+
+## Secure sign-out of frontline workers from shared devices
+
+Many companies use shared devices so frontline workers can do inventory management and point-of-sale transactions, without the IT burden of provisioning and tracking individual devices. With shared device sign-out, it's easy for a frontline worker to securely sign out of all apps on any shared device before handing it back to a hub or passing it off to a teammate on the next shift. Microsoft Teams is one of the apps that is currently supported on shared devices and it allows frontline workers to view tasks that are assigned to them. Once a worker signs out of a shared device, Intune and Azure AD clear all of the company data so the device can safely be handed off to the next associate. You can choose to integrate this capability into all your line-of-business [iOS](../develop/msal-ios-shared-devices.md) and [Android](../develop/msal-android-shared-devices.md) apps using the [Microsoft Authentication Library](../develop/msal-overview.md).
+
+![Shared device sign-out](media/concept-fundamentals-frontline-worker/shared-device-signout.png)
+
+## Next steps
+
+- For more information on delegated user management, see [My Staff user documentation](../user-help/my-staff-team-manager.md).
+- For inbound user provisioning from SAP SuccessFactors, see the tutorial on [configuring SAP SuccessFactors to Active Directory user provisioning](../saas-apps/sap-successfactors-inbound-provisioning-tutorial.md).
+- For inbound user provisioning from Workday, see the tutorial on [configuring Workday for automatic user provisioning](../saas-apps/workday-inbound-tutorial.md).
active-directory Entitlement Management Catalog Create https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/governance/entitlement-management-catalog-create.md
To include resources in an access package, the resources must exist in a catalog
These resources can now be included in access packages within the catalog.
-### Add a Multi-geo SharePoint Site
+### Add a Multi-geo SharePoint Site (Preview)
1. If you have [Multi-Geo](/microsoft-365/enterprise/multi-geo-capabilities-in-onedrive-and-sharepoint-online-in-microsoft-365) enabled for SharePoint, select the environment you would like to select sites from.
You can also delete a catalog using Microsoft Graph. A user in an appropriate r
## Next steps
-- [Delegate access governance to access package managers](entitlement-management-delegate-managers.md)
+- [Delegate access governance to access package managers](entitlement-management-delegate-managers.md)
active-directory How To Connect Install Automatic Upgrade https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/hybrid/how-to-connect-install-automatic-upgrade.md
# Azure AD Connect: Automatic upgrade
-This feature was introduced with build [1.1.105.0 (released February 2016)](reference-connect-version-history.md). This feature was updated in [build 1.1.561](reference-connect-version-history.md) and now supports additional scenarios that were previously not supported.
+Azure AD Connect automatic upgrade is a feature that regularly checks for newer versions of Azure AD Connect. If your server is enabled for automatic upgrade and a newer version is found for which your server is eligible, it will perform an automatic upgrade to that newer version.
+Note that, for security reasons, the agent that performs the automatic upgrade validates the new build of Azure AD Connect based on the digital signature of the downloaded version.
## Overview
Making sure your Azure AD Connect installation is always up to date has never been easier with the **automatic upgrade** feature. This feature is enabled by default for express installations and DirSync upgrades. When a new version is released, your installation is automatically upgraded.
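On the Azure AD Connect server itself, the automatic upgrade state can be inspected or changed with the ADSync module that ships with Azure AD Connect. A sketch, run from an elevated PowerShell session on that server:

```PowerShell
# Returns the current automatic upgrade state: Enabled, Disabled, or Suspended.
Get-ADSyncAutoUpgrade

# Turn automatic upgrade on for this server.
Set-ADSyncAutoUpgrade -AutoUpgradeState Enabled
```

A Suspended state means the system has put upgrades on hold for this server; it is worth checking before assuming upgrades are flowing.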
active-directory Reference Connect Version History https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/hybrid/reference-connect-version-history.md
Please follow this link to read more about [auto upgrade](how-to-connect-install
## 1.6.2.4
+>[!IMPORTANT]
+> Update per March 30, 2021: we have discovered an issue in this build. After installation of this build, the Health services are not registered. We recommend not installing this build. We will release a hotfix shortly.
> If you already installed this build, you can manually register the Health services by using the cmdlet as shown in [this article](https://docs.microsoft.com/azure/active-directory/hybrid/how-to-connect-health-agent-install#manually-register-azure-ad-connect-health-for-sync).
>[!NOTE]
> - This release will be made available for download only.
active-directory Plan Sso Deployment https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/plan-sso-deployment.md
-#customer intent: As an IT admin, I need to learn about single-sign on (SSO) so I can understand the feature and help others in my organization to understand its value.
+# Customer intent: As an IT admin, I need to learn about single-sign on (SSO) so I can understand the feature and help others in my organization to understand its value.
# Plan a single sign-on deployment
active-directory How Managed Identities Work Vm https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/managed-identities-azure-resources/how-managed-identities-work-vm.md
Last updated 06/11/2020 -
-#As a developer, I'd like to ...
active-directory Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/managed-identities-azure-resources/overview.md
Last updated 10/06/2020 -
-#As a developer, I'd like to securely manage the credentials that my application uses for authenticating to cloud services without having the credentials in my code or checked into source control.
+
+#Customer intent: As a developer, I'd like to securely manage the credentials that my application uses for authenticating to cloud services without having the credentials in my code or checked into source control.
# What are managed identities for Azure resources?
active-directory Delegate App Roles https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/delegate-app-roles.md
Last updated 11/04/2020
-#As an Azure AD administrator, I want to reduce overusing the Global Administrator role by delegating app access management to lower-privilege roles.
-
+#Customer intent: As an Azure AD administrator, I want to reduce overusing the Global Administrator role by delegating app access management to lower-privilege roles.
+ # Delegate app registration permissions in Azure Active Directory
active-directory Delegate By Task https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/delegate-by-task.md
Last updated 11/05/2020
-#As an Azure AD administrator, I want to know which role has the least privilege for a given task to make my Azure AD organization more secure.
-
+#Customer intent: As an Azure AD administrator, I want to know which role has the least privilege for a given task to make my Azure AD organization more secure.
# Administrator roles by admin task in Azure Active Directory
active-directory M365 Workload Docs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/m365-workload-docs.md
Last updated 11/05/2020
-#As an Azure AD administrator, to delegate permissions across Microsoft 365 services quickly and accurately I want to know where the content is for admin roles.
+#Customer intent: As an Azure AD administrator, to delegate permissions across Microsoft 365 services quickly and accurately I want to know where the content is for admin roles.
# Roles for Microsoft 365 services in Azure Active Directory
active-directory Proware Provisioning Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/proware-provisioning-tutorial.md
+
+ Title: 'Tutorial: Configure Proware for automatic user provisioning with Azure Active Directory | Microsoft Docs'
+description: Learn how to automatically provision and de-provision user accounts from Azure AD to Proware.
+
+documentationcenter: ''
+
+writer: Zhchia
++
+ms.assetid: 8887932e-e27e-419b-aa85-a0cda428d525
+++
+ na
+ms.devlang: na
+ Last updated : 03/30/2021
+# Tutorial: Configure Proware for automatic user provisioning
+
+This tutorial describes the steps you need to perform in both Proware and Azure Active Directory (Azure AD) to configure automatic user provisioning. When configured, Azure AD automatically provisions and de-provisions users and groups to [Proware](https://www.metaware.nl/Proware) using the Azure AD Provisioning service. For important details on what this service does, how it works, and frequently asked questions, see [Automate user provisioning and deprovisioning to SaaS applications with Azure Active Directory](../manage-apps/user-provisioning.md).
++
+## Capabilities Supported
+> [!div class="checklist"]
+> * Create users in Proware
+> * Remove users in Proware when they do not require access anymore
+> * Keep user attributes synchronized between Azure AD and Proware
+> * [Single sign-on](https://docs.microsoft.com/azure/active-directory/saas-apps/proware-tutorial) to Proware (recommended)
+
+## Prerequisites
+
+The scenario outlined in this tutorial assumes that you already have the following prerequisites:
+
+* [An Azure AD tenant](https://docs.microsoft.com/azure/active-directory/develop/quickstart-create-new-tenant)
+* A user account in Azure AD with [permission](https://docs.microsoft.com/azure/active-directory/users-groups-roles/directory-assign-admin-roles) to configure provisioning (for example, Application Administrator, Cloud Application administrator, Application Owner, or Global Administrator).
+* A [Proware](https://www.metaware.nl/Proware) subscription.
+* A user account in Proware with Administrator access.
++
+## Step 1. Plan your provisioning deployment
+1. Learn about [how the provisioning service works](https://docs.microsoft.com/azure/active-directory/manage-apps/user-provisioning).
+2. Determine who will be in [scope for provisioning](https://docs.microsoft.com/azure/active-directory/manage-apps/define-conditional-rules-for-provisioning-user-accounts).
+3. Determine what data to [map between Azure AD and Proware](https://docs.microsoft.com/azure/active-directory/manage-apps/customize-application-attributes).
+
+## Step 2. Configure Proware to support provisioning with Azure AD
+1. Sign in to the [Proware](https://www.metaware.nl/Proware) application.
+2. Navigate to **Control panel** -> **Admin**.
+3. Select **Control panel settings**, scroll down to **User Provisioning** and then **enable** User Provisioning.
+4. Click on the **Create bearer token** button and copy the **Token**. This value will be entered in the Secret Token field in the Provisioning tab of your Proware application in the Azure portal.
+5. Copy the **Tenant URL**. This value will be entered in the Tenant URL field in the Provisioning tab of your Proware application in the Azure portal.
+
+## Step 3. Add Proware from the Azure AD application gallery
+
+Add Proware from the Azure AD application gallery to start managing provisioning to Proware. If you have previously set up Proware for SSO, you can use the same application. However, it is recommended that you create a separate app when testing out the integration initially. Learn more about adding an application from the gallery [here](https://docs.microsoft.com/azure/active-directory/manage-apps/add-gallery-app).
+
+## Step 4. Define who will be in scope for provisioning
+
+The Azure AD provisioning service allows you to scope who will be provisioned based on assignment to the application, based on attributes of the user or group, or both. If you choose to scope who will be provisioned to your app based on assignment, you can use the following [steps](../manage-apps/assign-user-or-group-access-portal.md) to assign users and groups to the application. If you choose to scope who will be provisioned based solely on attributes of the user or group, you can use a scoping filter as described [here](https://docs.microsoft.com/azure/active-directory/manage-apps/define-conditional-rules-for-provisioning-user-accounts).
+
+* When assigning users and groups to Proware, you must select a role other than **Default Access**. Users with the Default Access role are excluded from provisioning and will be marked as not effectively entitled in the provisioning logs. If the only role available on the application is the default access role, you can [update the application manifest](https://docs.microsoft.com/azure/active-directory/develop/howto-add-app-roles-in-azure-ad-apps) to add additional roles.
+
+* Start small. Test with a small set of users and groups before rolling out to everyone. When scope for provisioning is set to assigned users and groups, you can control this by assigning one or two users or groups to the app. When scope is set to all users and groups, you can specify an [attribute based scoping filter](https://docs.microsoft.com/azure/active-directory/manage-apps/define-conditional-rules-for-provisioning-user-accounts).
++
+## Step 5. Configure automatic user provisioning to Proware
+
+This section guides you through the steps to configure the Azure AD provisioning service to create, update, and disable users and/or groups in Proware based on user and/or group assignments in Azure AD.
+
+### To configure automatic user provisioning for Proware in Azure AD:
+
+1. Sign in to the [Azure portal](https://portal.azure.com). Select **Enterprise Applications**, then select **All applications**.
+
+ ![Enterprise applications blade](common/enterprise-applications.png)
+
+2. In the applications list, select **Proware**.
+
+ ![The Proware link in the Applications list](common/all-applications.png)
+
+3. Select the **Provisioning** tab.
+
+ ![Provisioning tab](common/provisioning.png)
+
+4. Set the **Provisioning Mode** to **Automatic**.
+
+ ![Provisioning tab automatic](common/provisioning-automatic.png)
+
+5. Under the **Admin Credentials** section, input your Proware Tenant URL and Secret Token. Click **Test Connection** to ensure Azure AD can connect to Proware. If the connection fails, ensure your Proware account has Admin permissions and try again.
+
+ ![Token](common/provisioning-testconnection-tenanturltoken.png)
+
+6. In the **Notification Email** field, enter the email address of a person or group who should receive the provisioning error notifications and select the **Send an email notification when a failure occurs** check box.
+
+ ![Notification Email](common/provisioning-notification-email.png)
+
+7. Select **Save**.
+
+8. Under the **Mappings** section, select **Synchronize Azure Active Directory Users to Proware**.
+
+9. Review the user attributes that are synchronized from Azure AD to Proware in the **Attribute-Mapping** section. The attributes selected as **Matching** properties are used to match the user accounts in Proware for update operations. If you choose to change the [matching target attribute](https://docs.microsoft.com/azure/active-directory/manage-apps/customize-application-attributes), you will need to ensure that the Proware API supports filtering users based on that attribute. Select the **Save** button to commit any changes.
+
+ |Attribute|Type|Supported for Filtering|
+ |---|---|---|
+ |userName|String|&check;|
+ |active|Boolean||
+ |title|String||
+ |externalId|String||
+ |name.givenName|String||
+ |name.familyName|String||
+ |name.formatted|String||
+ |urn:ietf:params:scim:schemas:extension:enterprise:2.0:User:department|String||
+
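For update operations, the provisioning service locates an existing account by querying the app's SCIM endpoint with a filter on the matching attribute (`userName` in the table above). A hedged sketch of what that lookup looks like; the tenant URL and bearer token are the values captured in Step 2, and the `/Users` path with RFC 7644 filter syntax is an assumption about the endpoint layout:

```PowerShell
$tenantUrl = 'https://<your-proware-tenant-url>'            # placeholder from Step 2
$headers   = @{ Authorization = 'Bearer <token-from-step-2>' }  # placeholder token

# RFC 7644 filter syntax: find the account whose userName matches the Azure AD value.
$filter = [uri]::EscapeDataString('userName eq "johndoe@contoso.com"')
Invoke-RestMethod -Uri "$tenantUrl/Users?filter=$filter" -Headers $headers
```

This is why the matching attribute must be filterable: if the SCIM endpoint cannot evaluate the filter, update operations cannot find the target account.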
+10. To configure scoping filters, refer to the following instructions provided in the [Scoping filter tutorial](../manage-apps/define-conditional-rules-for-provisioning-user-accounts.md).
+
+11. To enable the Azure AD provisioning service for Proware, change the **Provisioning Status** to **On** in the **Settings** section.
+
+ ![Provisioning Status Toggled On](common/provisioning-toggle-on.png)
+
+12. Define the users and/or groups that you would like to provision to Proware by choosing the desired values in **Scope** in the **Settings** section.
+
+ ![Provisioning Scope](common/provisioning-scope.png)
+
+13. When you are ready to provision, click **Save**.
+
+ ![Saving Provisioning Configuration](common/provisioning-configuration-save.png)
+
+This operation starts the initial synchronization cycle of all users and groups defined in **Scope** in the **Settings** section. The initial cycle takes longer to perform than subsequent cycles, which occur approximately every 40 minutes as long as the Azure AD provisioning service is running.
+
+## Step 6. Monitor your deployment
+Once you've configured provisioning, use the following resources to monitor your deployment:
+
+1. Use the [provisioning logs](https://docs.microsoft.com/azure/active-directory/reports-monitoring/concept-provisioning-logs) to determine which users have been provisioned successfully or unsuccessfully.
+2. Check the [progress bar](https://docs.microsoft.com/azure/active-directory/app-provisioning/application-provisioning-when-will-provisioning-finish-specific-user) to see the status of the provisioning cycle and how close it is to completion.
+3. If the provisioning configuration seems to be in an unhealthy state, the application will go into quarantine. Learn more about quarantine states [here](https://docs.microsoft.com/azure/active-directory/manage-apps/application-provisioning-quarantine-status).
+
+## Additional resources
+
+* [Managing user account provisioning for Enterprise Apps](../manage-apps/configure-automatic-user-provisioning-portal.md)
+* [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+
+## Next steps
+
+* [Learn how to review logs and get reports on provisioning activity](../manage-apps/check-status-user-account-provisioning.md)
aks Azure Disk Csi https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/aks/azure-disk-csi.md
The default storage classes suit the most common scenarios, but not all. In some cases, you might want to have your own storage class customized with your own parameters. For example, you might want to change the `volumeBindingMode` setting.
-The default storage classes use a `volumeBindingMode: Immediate` class that guarantees that occurs immediately once the PVC is created. In cases where your node pools are topology constrained, for example, using availability zones, PVs would be bound or provisioned without knowledge of the pod's scheduling requirements (in this case to be in a specific zone).
+You can use a `volumeBindingMode: Immediate` class, which guarantees that volume binding and provisioning occur immediately once the PVC is created. In cases where your node pools are topology constrained, for example when using availability zones, PVs would be bound or provisioned without knowledge of the pod's scheduling requirements (in this case, to be in a specific zone).
-To address this scenario, you can use `volumeBindingMode: WaitForFirstConsumer`, which delays the binding and provisioning of a PV until a pod that uses the PVC is created. In this way, the PV will conform and be provisioned in the availability zone (or other topology) that's specified by the pod's scheduling constraints.
+To address this scenario, you can use `volumeBindingMode: WaitForFirstConsumer`, which delays the binding and provisioning of a PV until a pod that uses the PVC is created. In this way, the PV conforms to, and is provisioned in, the availability zone (or other topology) that's specified by the pod's scheduling constraints. The default storage classes use the `volumeBindingMode: WaitForFirstConsumer` class.
Create a file named `sc-azuredisk-csi-waitforfirstconsumer.yaml`, and paste the following manifest. The storage class is the same as our `managed-csi` storage class, but with a different `volumeBindingMode` setting.
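The manifest itself is a standard StorageClass object. A sketch of what it plausibly looks like, assuming the Azure Disk CSI provisioner and a `managed-csi`-style `skuname` parameter (the exact parameters in your cluster's default class may differ):

```shell
# Sketch of the storage class manifest described above; the skuname value
# is an assumption based on the managed-csi defaults.
cat <<'EOF' > sc-azuredisk-csi-waitforfirstconsumer.yaml
kind: StorageClass
apiVersion: storage.k8s.io/v1
metadata:
  name: azuredisk-csi-waitforfirstconsumer
provisioner: disk.csi.azure.com
parameters:
  skuname: StandardSSD_LRS
reclaimPolicy: Delete
volumeBindingMode: WaitForFirstConsumer
EOF
# Apply it to the cluster:
#   kubectl apply -f sc-azuredisk-csi-waitforfirstconsumer.yaml
echo "wrote sc-azuredisk-csi-waitforfirstconsumer.yaml"
```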
$ kubectl exec -it busybox-azuredisk-0 -- cat c:\mnt\azuredisk\data.txt # on Win
[az-extension-update]: /cli/azure/extension#az-extension-update [az-feature-register]: /cli/azure/feature#az-feature-register [az-feature-list]: /cli/azure/feature#az-feature-list
-[az-provider-register]: /cli/azure/provider#az-provider-register
+[az-provider-register]: /cli/azure/provider#az-provider-register
aks View Metrics https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/aks/view-metrics.md
+
+ Title: View cluster metrics for Azure Kubernetes Service (AKS)
+description: View cluster metrics for Azure Kubernetes Service (AKS).
++ Last updated : 03/30/2021++
+# View cluster metrics for Azure Kubernetes Service (AKS)
+
+AKS provides a set of metrics for the control plane, including the API Server and cluster autoscaler, and cluster nodes. These metrics allow you to monitor the health of your cluster and troubleshoot issues. You can view the metrics for your cluster using the Azure portal.
+
+> [!NOTE]
+> These AKS cluster metrics overlap with a subset of the [metrics provided by Kubernetes][kubernetes-metrics].
+
+## View metrics for your AKS cluster using the Azure portal
+
+To view the metrics for your AKS cluster:
+
+1. Sign in to the [Azure portal][azure-portal] and navigate to your AKS cluster.
+1. On the left side under *Monitoring*, select *Metrics*.
+1. Create a chart for the metrics you want to view. For example, to create a chart that shows the number of pods by phase:
+ 1. For *Scope*, choose your cluster.
+ 1. For *Metric Namespace*, choose *Container service (managed) standard metrics*.
+ 1. For *Metric*, under *Pods* choose *Number of Pods by phase*.
+ 1. For *Aggregation* choose *Avg*.
++
+The above example shows the metrics for the average number of pods for the *myAKSCluster*.
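The same chart data can also be queried from the command line. Below is a hedged sketch using the Azure CLI's `az monitor metrics list`; the resource group and cluster names are placeholders, and the command only runs if you're signed in to a subscription:

```shell
# Placeholders - substitute your own resource group and cluster name.
RESOURCE_GROUP="myResourceGroup"
CLUSTER="myAKSCluster"
METRIC="kube_pod_status_phase"

if az account show >/dev/null 2>&1; then
  # Resolve the cluster's full resource ID, then query the metric.
  RESOURCE_ID=$(az aks show -g "$RESOURCE_GROUP" -n "$CLUSTER" --query id -o tsv)
  az monitor metrics list \
    --resource "$RESOURCE_ID" \
    --metric "$METRIC" \
    --aggregation Average \
    --interval PT5M
else
  echo "Not signed in to Azure; run 'az login' and adjust the placeholder names first."
fi
```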
+
+## Available metrics
+
+The following cluster metrics are available:
+
+| Name | Group | ID | Description |
+| | | | - |
+| Inflight Requests | API Server (preview) |apiserver_current_inflight_requests | Maximum number of currently active inflight requests on the API Server per request kind. |
+| Cluster Health | Cluster Autoscaler (preview) | cluster_autoscaler_cluster_safe_to_autoscale | Determines whether or not cluster autoscaler will take action on the cluster. |
+| Scale Down Cooldown | Cluster Autoscaler (preview) | cluster_autoscaler_scale_down_in_cooldown | Determines if the scale-down is in cooldown. No nodes will be removed during this timeframe. |
+| Unneeded Nodes | Cluster Autoscaler (preview) | cluster_autoscaler_unneeded_nodes_count | Number of nodes that the cluster autoscaler marks as unneeded; these nodes are candidates for deletion and are eventually removed. |
+| Unschedulable Pods | Cluster Autoscaler (preview) | cluster_autoscaler_unschedulable_pods_count | Number of pods that are currently unschedulable in the cluster. |
+| Total number of available cpu cores in a managed cluster | Nodes | kube_node_status_allocatable_cpu_cores | Total number of available CPU cores in a managed cluster. |
+| Total amount of available memory in a managed cluster | Nodes | kube_node_status_allocatable_memory_bytes | Total amount of available memory in a managed cluster. |
+| Statuses for various node conditions | Nodes | kube_node_status_condition | Statuses for various node conditions. |
+| CPU Usage Millicores | Nodes (preview) | node_cpu_usage_millicores | Aggregated measurement of CPU utilization in millicores across the cluster. |
+| CPU Usage Percentage | Nodes (preview) | node_cpu_usage_percentage | Aggregated average CPU utilization measured in percentage across the cluster. |
+| Memory RSS Bytes | Nodes (preview) | node_memory_rss_bytes | Container RSS memory used in bytes. |
+| Memory RSS Percentage | Nodes (preview) | node_memory_rss_percentage | Container RSS memory used in percent. |
+| Memory Working Set Bytes | Nodes (preview) | node_memory_working_set_bytes | Container working set memory used in bytes. |
+| Memory Working Set Percentage | Nodes (preview) | node_memory_working_set_percentage | Container working set memory used in percent. |
+| Disk Used Bytes | Nodes (preview) | node_disk_usage_bytes | Disk space used in bytes by device. |
+| Disk Used Percentage | Nodes (preview) | node_disk_usage_percentage | Disk space used in percent by device. |
+| Network In Bytes | Nodes (preview) | node_network_in_bytes | Network received bytes. |
+| Network Out Bytes | Nodes (preview) | node_network_out_bytes | Network transmitted bytes. |
+| Number of pods in Ready state | Pods | kube_pod_status_ready | Number of pods in *Ready* state. |
+| Number of pods by phase | Pods | kube_pod_status_phase | Number of pods by phase. |
+
+> [!IMPORTANT]
+> Metrics in preview can be updated or changed, including their names and descriptions, while in preview.
+
+## Next steps
+
+In addition to the cluster metrics for AKS, you can also use Azure Monitor with your AKS cluster. For more information on using Azure Monitor with AKS, see [Azure Monitor for containers][aks-azure-monitor].
+
+[aks-azure-monitor]: ../azure-monitor/containers/container-insights-overview.md
+[azure-portal]: https://portal.azure.com/
+[kubernetes-metrics]: https://kubernetes.io/docs/concepts/cluster-administration/system-metrics/
app-service Overview Local Cache https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/overview-local-cache.md
# Azure App Service Local Cache overview > [!NOTE]
-> Local cache is not supported in Function apps or containerized App Service apps, such as in [Windows Containers](quickstart-custom-container.md?pivots=container-windows) or on [App Service on Linux](overview.md#app-service-on-linux).
+> Local cache is not supported in function apps or containerized App Service apps, such as in [Windows Containers](quickstart-custom-container.md?pivots=container-windows) or in [App Service on Linux](overview.md#app-service-on-linux). For these app types, [App Cache](https://github.com/Azure-App-Service/KuduLite/wiki/App-Cache) is the available alternative.
Azure App Service content is stored on Azure Storage and is surfaced in a durable manner as a content share. This design is intended to work with a variety of apps and has the following attributes:
app-service Quickstart Dotnet Framework https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/quickstart-dotnet-framework.md
- Title: 'Quickstart: Create a C# ASP.NET app'
-description: Learn how to run web apps in Azure App Service by deploying the default C# ASP.NET web app template from Visual Studio.
- Previously updated : 11/20/2020---
-# Create an ASP.NET Framework web app in Azure
-
-[Azure App Service](overview.md) provides a highly scalable, self-patching web hosting service.
-
-This quickstart shows how to deploy your first ASP.NET web app to Azure App Service. When you're finished, you'll have an App Service plan. You'll also have an App Service app with a deployed web application.
--
-## Prerequisites
-
-To complete this tutorial, install <a href="https://www.visualstudio.com/downloads/" target="_blank">Visual Studio 2019</a> with the **ASP.NET and web development** workload.
-
-If you've installed Visual Studio 2019 already:
--- Install the latest updates in Visual Studio by selecting **Help** > **Check for Updates**.-- Add the workload by selecting **Tools** > **Get Tools and Features**.-
-## Create an ASP.NET web app <a name="create-and-publish-the-web-app"></a>
-
-Create an ASP.NET web app by following these steps:
-
-1. Open Visual Studio and then select **Create a new project**.
-
-2. In **Create a new project**, find and choose **ASP.NET Web Application (.NET Framework)**, then select **Next**.
-
-3. In **Configure your new project**, name the application _myFirstAzureWebApp_, and then select **Create**.
-
- ![Configure your web app project](./media/quickstart-dotnet-framework/configure-web-app-project-framework.png)
-
-4. You can deploy any type of ASP.NET web app to Azure. For this quickstart, choose the **MVC** template.
-
-5. Make sure authentication is set to **No Authentication**. Select **Create**.
-
- ![Create ASP.NET Web Application](./media/quickstart-dotnet-framework/select-mvc-template-vs2019.png)
-
-6. From the Visual Studio menu, select **Debug** > **Start Without Debugging** to run the web app locally.
-
- ![Run app locally](./media/quickstart-dotnet-framework/local-web-app.png)
-
-## Publish your web app <a name="launch-the-publish-wizard"></a>
-
-1. In **Solution Explorer**, right-click the **myFirstAzureWebApp** project and select **Publish**.
-
-1. In **Publish**, select **Azure** and click **Next**.
-
-1. Select **Azure App Service (Windows)** and click **Next**.
-
- <!-- ![Publish from project overview page](./media/quickstart-dotnet-framework/publish-app-framework-vs2019.png) -->
-
-1. Your options depend on whether you're signed in to Azure already and whether you have a Visual Studio account linked to an Azure account. Select either **Add an account** or **Sign in** to sign in to your Azure subscription. If you're already signed in, select the account you want.
-
- ![Sign in to Azure](./media/quickstart-dotnet-framework/sign-in-azure-framework-vs2019.png)
-
- [!INCLUDE [resource group intro text](../../includes/resource-group.md)]
-
-1. To the right of **App Service instances**, click **+**.
-
- ![New App Service app](./media/quickstart-dotnet-framework/publish-new-app-service.png)
-
-1. For **Resource group**, select **New**.
-
-1. In **New resource group name**, enter *myResourceGroup* and select **OK**.
-
- [!INCLUDE [app-service-plan](../../includes/app-service-plan.md)]
-
-1. For **Hosting Plan**, select **New**.
-
-1. In the **Hosting Plan** dialog, enter the values from the following table, and then select **OK**.
-
- | Setting | Suggested Value | Description |
- |-|-|-|
- | Hosting Plan| myAppServicePlan | Name of the App Service plan. |
- | Location | West Europe | The datacenter where the web app is hosted. |
- | Size | Free | [Pricing tier](https://azure.microsoft.com/pricing/details/app-service/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio) determines hosting features. |
-
- ![Create App Service plan](./media/quickstart-dotnet-framework/app-service-plan-framework-vs2019.png)
-
-1. In **Name**, enter a unique app name that includes only the valid characters are `a-z`, `A-Z`, `0-9`, and `-`. You can accept the automatically generated unique name. The URL of the web app is `http://<app-name>.azurewebsites.net`, where `<app-name>` is your app name.
-
-2. Select **Create** to create the Azure resources.
-
- ![Configure app name](./media/quickstart-dotnet-framework/web-app-name-framework-vs2019.png)
-
- Once the wizard completes, the Azure resources are created for you and you are ready to publish.
-
-3. Select **Finish** to close the wizard.
-
-3. In the **Publish** page, click **Publish**. Visual Studio builds, packages, and publishes the app to Azure, and then launches the app in the default browser.
-
- ![Published ASP.NET web app in Azure](./media/quickstart-dotnet-framework/published-azure-web-app.png)
-
-The app name specified in the **App Service Create new** page is used as the URL prefix in the format `http://<app-name>.azurewebsites.net`.
-
-**Congratulations!** Your ASP.NET web app is running live in Azure App Service.
-
-## Update the app and redeploy
-
-1. In **Solution Explorer**, under your project, open **Views** > **Home** > **Index.cshtml**.
-
-1. Find the `<div class="jumbotron">` HTML tag near the top, and replace the entire element with the following code:
-
- ```html
- <div class="jumbotron">
- <h1>ASP.NET in Azure!</h1>
- <p class="lead">This is a simple app that we've built that demonstrates how to deploy a .NET app to Azure App Service.</p>
- </div>
- ```
-
-1. To redeploy to Azure, right-click the **myFirstAzureWebApp** project in **Solution Explorer** and select **Publish**. Then, select **Publish**.
-
- When publishing completes, Visual Studio launches a browser to the URL of the web app.
-
- ![Updated ASP.NET web app in Azure](./media/quickstart-dotnet-framework/updated-azure-web-app.png)
-
-## Manage the Azure app
-
-1. To manage the web app, go to the [Azure portal](https://portal.azure.com), and search for and select **App Services**.
-
- ![Select App services](./media/quickstart-dotnet-framework/app-services.png)
-
-2. On the **App Services** page, select the name of your web app.
-
- ![Portal navigation to Azure app](./media/quickstart-dotnet-framework/access-portal-framework-vs2019.png)
-
- You see your web app's Overview page. Here, you can do basic management like browse, stop, start, restart, and delete.
-
- ![App Service overview in Azure portal](./media/quickstart-dotnet-framework/web-app-general-framework-vs2019.png)
-
- The left menu provides different pages for configuring your app.
--
-## Next steps
-
-> [!div class="nextstepaction"]
-> [ASP.NET with SQL Database](app-service-web-tutorial-dotnet-sqldatabase.md)
-
-> [!div class="nextstepaction"]
-> [Configure ASP.NET app](configure-language-dotnet-framework.md)
app-service Quickstart Dotnetcore https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/quickstart-dotnetcore.md
Title: "Quickstart: Create a C# ASP.NET Core app"
-description: Learn how to run web apps in Azure App Service by deploying your first ASP.NET core app.
+ Title: "Quickstart: Deploy an ASP.NET web app"
+description: Learn how to run web apps in Azure App Service by deploying your first ASP.NET app.
ms.assetid: b1e6bd58-48d1-4007-9d6c-53fd6db061e3 Previously updated : 11/23/2020 Last updated : 03/30/2021
-zone_pivot_groups: app-service-platform-windows-linux
+zone_pivot_groups: app-service-ide
adobe-target: true adobe-target-activity: DocsExpΓÇô386541ΓÇôA/BΓÇôEnhanced-Readability-QuickstartsΓÇô2.19.2021 adobe-target-experience: Experience B adobe-target-content: ./quickstart-dotnetcore-uiex
-# Quickstart: Create an ASP.NET Core web app in Azure
+<!-- NOTES:
+I'm a .NET developer who wants to deploy my web app to App Service. I may develop apps with
+Visual Studio, Visual Studio for Mac, Visual Studio Code, or the .NET SDK/CLI. This article
+should be able to guide .NET devs, whether their app is .NET Core, .NET, or .NET Framework.
-In this quickstart, you'll learn how to create and deploy your first ASP.NET Core web app to [Azure App Service](overview.md). App Service supports .NET 5.0 apps.
+As a .NET developer, when choosing an IDE and .NET TFM - you map to various OS requirements.
+For example, if you choose Visual Studio - you're developing the app on Windows, but you can still
+target cross-platform with .NET Core 3.1 or .NET 5.0.
-When you're finished, you'll have an Azure resource group consisting of an App Service hosting plan and an App Service with a deployed web application.
+| .NET / IDE | Visual Studio | Visual Studio for Mac | Visual Studio Code | Command line |
+|--||--|--|-|
+| .NET Core 3.1 | Windows | macOS | Cross-platform | Cross-platform |
+| .NET 5.0 | Windows | macOS | Cross-platform | Cross-platform |
+| .NET Framework 4.8 | Windows | N/A | Windows | Windows |
+
+-->
+
+# Quickstart: Deploy an ASP.NET web app
+
+In this quickstart, you'll learn how to create and deploy your first ASP.NET web app to [Azure App Service](overview.md). App Service supports various versions of .NET apps, and provides a highly scalable, self-patching web hosting service. ASP.NET web apps are cross-platform and can be hosted on Linux or Windows. When you're finished, you'll have an Azure resource group consisting of an App Service hosting plan and an App Service with a deployed web application.
+
+> [!TIP]
+> .NET Core 3.1 is the current long-term support (LTS) release of .NET. For more information, see [.NET support policy](https://dotnet.microsoft.com/platform/support/policy/dotnet-core).
## Prerequisites

-- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/dotnet/).
-- Install <a href="https://www.visualstudio.com/downloads/" target="_blank">Visual Studio 2019</a> with the **ASP.NET and web development** workload.
+
+- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/dotnet).
+- <a href="https://www.visualstudio.com/downloads" target="_blank">Visual Studio 2019</a> with the **ASP.NET and web development** workload.
+
+ If you've already installed Visual Studio 2019:
+
+ - Install the latest updates in Visual Studio by selecting **Help** > **Check for Updates**.
+ - Add the workload by selecting **Tools** > **Get Tools and Features**.
+++
+- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/dotnet).
+- <a href="https://www.visualstudio.com/downloads" target="_blank">Visual Studio Code</a>.
+- The <a href="https://marketplace.visualstudio.com/items?itemName=ms-vscode.vscode-node-azure-pack" target="_blank">Azure Tools</a> extension.
+
+### [.NET Core 3.1](#tab/netcore31)
+
+<a href="https://dotnet.microsoft.com/download/dotnet-core/3.1" target="_blank">
+ Install the latest .NET Core 3.1 SDK.
+</a>
+
+### [.NET 5.0](#tab/net50)
+
+<a href="https://dotnet.microsoft.com/download/dotnet/5.0" target="_blank">
+ Install the latest .NET 5.0 SDK.
+</a>
+
+### [.NET Framework 4.8](#tab/netframework48)
+
+<a href="https://dotnet.microsoft.com/download/dotnet-framework/net48" target="_blank">
+ Install the .NET Framework 4.8 Developer Pack.
+</a>
+
+> [!NOTE]
+> Visual Studio Code is cross-platform; however, .NET Framework is not. If you're developing .NET Framework apps with Visual Studio Code, consider using a Windows machine to satisfy the build dependencies.
++++
+<!-- markdownlint-disable MD044 -->
+<!-- markdownlint-enable MD044 -->
+
+- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/dotnet).
+- The <a href="/cli/azure/install-azure-cli" target="_blank">Azure CLI</a>.
+- The .NET SDK (includes runtime and CLI).
+
+### [.NET Core 3.1](#tab/netcore31)
+
+<a href="https://dotnet.microsoft.com/download/dotnet-core/3.1" target="_blank">
+ Install the latest .NET Core 3.1 SDK.
+</a>
+
+### [.NET 5.0](#tab/net50)
+
+<a href="https://dotnet.microsoft.com/download/dotnet/5.0" target="_blank">
+ Install the latest .NET 5.0 SDK.
+</a>
+
+### [.NET Framework 4.8](#tab/netframework48)
+
+<a href="https://dotnet.microsoft.com/download/dotnet/5.0" target="_blank">
+ Install the latest .NET 5.0 SDK.
+</a> and <a href="https://dotnet.microsoft.com/download/dotnet-framework/net48" target="_blank">
+ the .NET Framework 4.8 Developer Pack.
+</a>
+
+> [!NOTE]
+> The [.NET CLI](/dotnet/core/tools) is cross-platform; however, .NET Framework is not. If you're developing .NET Framework apps with the .NET CLI, consider using a Windows machine to satisfy the build dependencies.
++++
+## Create an ASP.NET web app
++
+### [.NET Core 3.1](#tab/netcore31)
+
+1. Open Visual Studio and then select **Create a new project**.
+1. In **Create a new project**, find and choose **ASP.NET Core Web App**, then select **Next**.
+1. In **Configure your new project**, name the application _MyFirstAzureWebApp_, and then select **Next**.
+
+ :::image type="content" source="media/quickstart-dotnet/configure-webapp-net.png" alt-text="Configure ASP.NET Core 3.1 web app" border="true":::
+
+1. Select **.NET Core 3.1 (Long-term support)**.
+1. Make sure **Authentication Type** is set to **None**. Select **Create**.
+
+ :::image type="content" source="media/quickstart-dotnet/vs-additional-info-netcoreapp31.png" alt-text="Visual Studio - Select .NET Core 3.1 and None for Authentication Type." border="true":::
+
+1. From the Visual Studio menu, select **Debug** > **Start Without Debugging** to run the web app locally.
+
+ :::image type="content" source="media/quickstart-dotnet/local-webapp-net.png" alt-text="Visual Studio - .NET Core 3.1 browse locally" lightbox="media/quickstart-dotnet/local-webapp-net.png" border="true":::
+
+### [.NET 5.0](#tab/net50)
+
+1. Open Visual Studio and then select **Create a new project**.
+1. In **Create a new project**, find and choose **ASP.NET Core Web App**, then select **Next**.
+1. In **Configure your new project**, name the application _MyFirstAzureWebApp_, and then select **Next**.
+
+ :::image type="content" source="media/quickstart-dotnet/configure-webapp-net.png" alt-text="Visual Studio - Configure ASP.NET 5.0 web app." border="true":::
+
+1. Select **.NET Core 5.0 (Current)**.
+1. Make sure **Authentication Type** is set to **None**. Select **Create**.
+
+ :::image type="content" source="media/quickstart-dotnet/vs-additional-info-net50.png" alt-text="Visual Studio - Additional info when selecting .NET Core 5.0." border="true":::
+
+1. From the Visual Studio menu, select **Debug** > **Start Without Debugging** to run the web app locally.
+
+ :::image type="content" source="media/quickstart-dotnet/local-webapp-net.png" alt-text="Visual Studio - ASP.NET Core 5.0 running locally." lightbox="media/quickstart-dotnet/local-webapp-net.png" border="true":::
+
+### [.NET Framework 4.8](#tab/netframework48)
- If you've installed Visual Studio 2019 already:
+1. Open Visual Studio and then select **Create a new project**.
+1. In **Create a new project**, find and choose **ASP.NET Web Application (.NET Framework)**, then select **Next**.
+1. In **Configure your new project**, name the application _MyFirstAzureWebApp_, and then select **Create**.
- - Install the latest updates in Visual Studio by selecting **Help** > **Check for Updates**. The latest updates contain the .NET 5.0 SDK.
- - Add the workload by selecting **Tools** > **Get Tools and Features**.
+ :::image type="content" source="media/quickstart-dotnet/configure-webapp-netframework48.png" alt-text="Visual Studio - Configure ASP.NET Framework 4.8 web app." border="true":::
+1. Select the **MVC** template.
+1. Make sure **Authentication** is set to **No Authentication**. Select **Create**.
-## Create an ASP.NET Core web app
+ :::image type="content" source="media/quickstart-dotnet/vs-mvc-no-auth-netframework48.png" alt-text="Visual Studio - Select the MVC template." border="true":::
-Create an ASP.NET Core web app in Visual Studio by following these steps:
+1. From the Visual Studio menu, select **Debug** > **Start Without Debugging** to run the web app locally.
-# [.NET Core 3.1](#tab/netcore31)
+ :::image type="content" source="media/quickstart-dotnet/vs-local-webapp-netframework48.png" alt-text="Visual Studio - ASP.NET Framework 4.8 running locally." lightbox="media/quickstart-dotnet/vs-local-webapp-netframework48.png" border="true":::
-1. Open Visual Studio and select **Create a new project**.
++
-1. In **Create a new project**, select **ASP.NET Core Web Application** and confirm that **C#** is listed in the languages for that choice, then select **Next**.
-1. In **Configure your new project**, name your web application project *myFirstAzureWebApp*, and select **Create**.
+Create a new folder named _MyFirstAzureWebApp_, and open it in Visual Studio Code. Open the <a href="https://code.visualstudio.com/docs/editor/integrated-terminal" target="_blank">Terminal</a> window, and create a new .NET web app using the [`dotnet new webapp`](/dotnet/core/tools/dotnet-new#web-options) command.
- ![Configure your web app project](./media/quickstart-dotnetcore/configure-web-app-project.png)
+### [.NET Core 3.1](#tab/netcore31)
-1. You can deploy any type of ASP.NET Core web app to Azure, but for this quickstart, choose the **Web Application** template. Make sure **Authentication** is set to **No Authentication**, and that no other option is selected. Then, select **Create**.
+```dotnetcli
+dotnet new webapp -f netcoreapp3.1
+```
- ![Create a new ASP.NET Core web app](./media/quickstart-dotnetcore/create-aspnet-core-web-app.png)
-
-1. From the Visual Studio menu, select **Debug** > **Start Without Debugging** to run your web app locally.
+### [.NET 5.0](#tab/net50)
- ![Web app running locally](./media/quickstart-dotnetcore/web-app-running-locally.png)
+```dotnetcli
+dotnet new webapp -f net5.0
+```
+
+### [.NET Framework 4.8](#tab/netframework48)
+
+```dotnetcli
+dotnet new webapp --target-framework-override net48
+```
-# [.NET 5.0](#tab/net50)
+> [!IMPORTANT]
+> The `--target-framework-override` flag is a free-form text replacement of the target framework moniker (TFM) for the project, and makes *no guarantees* that the supporting template exists or compiles. You can only build and run .NET Framework apps on Windows.
+++
+From the **Terminal** in Visual Studio Code, run the application locally using the [`dotnet run`](/dotnet/core/tools/dotnet-run) command.
+
+```dotnetcli
+dotnet run
+```
-1. Open Visual Studio and select **Create a new project**.
+Open a web browser, and navigate to the app at `https://localhost:5001`.
-1. In **Create a new project**, select **ASP.NET Core Web Application** and confirm that **C#** is listed in the languages for that choice, then select **Next**.
-1. In **Configure your new project**, name your web application project *myFirstAzureWebApp*, and select **Create**.
+### [.NET Core 3.1](#tab/netcore31)
- ![Configure your web app project](./media/quickstart-dotnetcore/configure-web-app-project.png)
+You'll see the template ASP.NET Core 3.1 web app displayed in the page.
-1. For a .NET 5.0 app, select **ASP.NET Core 5.0** in the dropdown.
-1. You can deploy any type of ASP.NET Core web app to Azure, but for this quickstart, choose the **ASP.NET Core Web App** template. Make sure **Authentication** is set to **No Authentication**, and that no other option is selected. Then, select **Create**.
+### [.NET 5.0](#tab/net50)
- ![Create a new ASP.NET Core web app](./media/quickstart-dotnetcore/create-aspnet-core-web-app-5.png)
-
-1. From the Visual Studio menu, select **Debug** > **Start Without Debugging** to run your web app locally.
+You'll see the template ASP.NET Core 5.0 web app displayed in the page.
- ![Web app running locally](./media/quickstart-dotnetcore/web-app-running-locally.png)
+
+### [.NET Framework 4.8](#tab/netframework48)
+
+You'll see the template ASP.NET Framework 4.8 web app displayed in the page.
+ +
+<!-- markdownlint-disable MD044 -->
+<!-- markdownlint-enable MD044 -->
+
+Open a terminal window on your machine to a working directory. Create a new .NET web app using the [`dotnet new webapp`](/dotnet/core/tools/dotnet-new#web-options) command, and then change directories into the newly created app.
+
+### [.NET Core 3.1](#tab/netcore31)
+
+```dotnetcli
+dotnet new webapp -n MyFirstAzureWebApp -f netcoreapp3.1 && cd MyFirstAzureWebApp
+```
+
+### [.NET 5.0](#tab/net50)
+
+```dotnetcli
+dotnet new webapp -n MyFirstAzureWebApp -f net5.0 && cd MyFirstAzureWebApp
+```
+
+### [.NET Framework 4.8](#tab/netframework48)
+
+```dotnetcli
+dotnet new webapp -n MyFirstAzureWebApp --target-framework-override net48 && cd MyFirstAzureWebApp
+```
+
+> [!IMPORTANT]
+> The `--target-framework-override` flag is a free-form text replacement of the target framework moniker (TFM) for the project, and makes *no guarantees* that the supporting template exists or compiles. You can only build .NET Framework apps on Windows.
+++
+From the same terminal session, run the application locally using the [`dotnet run`](/dotnet/core/tools/dotnet-run) command.
+
+```dotnetcli
+dotnet run
+```
+
+Open a web browser, and navigate to the app at `https://localhost:5001`.
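In place of a browser, a quick smoke test from a second terminal can confirm the app is serving. This is a sketch; the port can differ, so check the "Now listening on" lines printed by `dotnet run`:

```shell
# -k accepts the self-signed ASP.NET Core development certificate;
# -s suppresses progress output. Run while `dotnet run` is active.
if curl -k -s -o /dev/null https://localhost:5001; then
  STATUS="up"
else
  STATUS="down"
fi
echo "local app status: $STATUS"
```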
+
+### [.NET Core 3.1](#tab/netcore31)
+
+You'll see the template ASP.NET Core 3.1 web app displayed in the page.
++
+### [.NET 5.0](#tab/net50)
+
+You'll see the template ASP.NET Core 5.0 web app displayed in the page.
++
+### [.NET Framework 4.8](#tab/netframework48)
+
+You'll see the template ASP.NET Framework 4.8 web app displayed in the page.
+++++

## Publish your web app
-To publish your web app, you must first create and configure a new App Service that you can publish your app to.
+To publish your web app, you must first create and configure a new App Service that you can publish your app to.
As part of setting up the App Service, you'll create:

- A new [resource group](../azure-resource-manager/management/overview.md#terminology) to contain all of the Azure resources for the service.
-- A new [Hosting Plan](./overview-hosting-plans.md) that specifies the location, size, and features of the web server farm that hosts your app.
+- A new [Hosting Plan](overview-hosting-plans.md) that specifies the location, size, and features of the web server farm that hosts your app.
Follow these steps to create your App Service and publish your web app:
-1. In **Solution Explorer**, right-click the **myFirstAzureWebApp** project and select **Publish**.
-1. In **Publish**, select **Azure** and click **Next**.
+1. In **Solution Explorer**, right-click the **MyFirstAzureWebApp** project and select **Publish**.
+1. In **Publish**, select **Azure** and then **Next**.
-1. Your options depend on whether you're signed in to Azure already and whether you have a Visual Studio account linked to an Azure account. Select either **Add an account** or **Sign in** to sign in to your Azure subscription. If you're already signed in, select the account you want.
+ :::image type="content" source="media/quickstart-dotnet/vs-publish-target-azure.png" alt-text="Visual Studio - Publish the web app and target Azure." border="true":::
- ![Sign in to Azure](./media/quickstart-dotnetcore/sign-in-azure-vs2019.png)
+1. Your options depend on whether you're signed in to Azure already and whether you have a Visual Studio account linked to an Azure account. Select either **Add an account** or **Sign in** to sign in to your Azure subscription. If you're already signed in, select the account you want.
-1. To the right of **App Service instances**, click **+**.
+ :::image type="content" source="media/quickstart-dotnetcore/sign-in-azure-vs2019.png" border="true" alt-text="Visual Studio - Select sign in to Azure dialog.":::
- ![New App Service app](./media/quickstart-dotnetcore/publish-new-app-service.png)
+1. Choose the **Specific target**, either **Azure App Service (Linux)** or **Azure App Service (Windows)**.
-1. For **Subscription**, accept the subscription that is listed or select a new one from the drop-down list.
+ > [!IMPORTANT]
+ > When targeting ASP.NET Framework 4.8, you will use **Azure App Service (Windows)**.
-1. For **Resource group**, select **New**. In **New resource group name**, enter *myResourceGroup* and select **OK**.
+1. To the right of **App Service instances**, select **+**.
-1. For **Hosting Plan**, select **New**.
+ :::image type="content" source="media/quickstart-dotnetcore/publish-new-app-service.png" border="true" alt-text="Visual Studio - New App Service app dialog.":::
+1. For **Subscription**, accept the subscription that is listed or select a new one from the drop-down list.
+1. For **Resource group**, select **New**. In **New resource group name**, enter *myResourceGroup* and select **OK**.
+1. For **Hosting Plan**, select **New**.
1. In the **Hosting Plan: Create new** dialog, enter the values specified in the following table:
- | Setting | Suggested Value | Description |
- | -- | | -- |
- | **Hosting Plan** | *myFirstAzureWebAppPlan* | Name of the App Service plan. |
- | **Location** | *West Europe* | The datacenter where the web app is hosted. |
- | **Size** | *Free* | [Pricing tier](https://azure.microsoft.com/pricing/details/app-service/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio) determines hosting features. |
-
- ![Create new Hosting Plan](./media/quickstart-dotnetcore/create-new-hosting-plan-vs2019.png)
+ | Setting | Suggested value | Description |
+ |--|--|--|
+ | **Hosting Plan** | *MyFirstAzureWebAppPlan* | Name of the App Service plan. |
+ | **Location** | *West Europe* | The datacenter where the web app is hosted. |
+ | **Size** | *Free* | [Pricing tier][app-service-pricing-tier] determines hosting features. |
-1. In **Name**, enter a unique app name that includes only the valid characters are `a-z`, `A-Z`, `0-9`, and `-`. You can accept the automatically generated unique name. The URL of the web app is `http://<app-name>.azurewebsites.net`, where `<app-name>` is your app name.
+ :::image type="content" source="media/quickstart-dotnetcore/create-new-hosting-plan-vs2019.png" border="true" alt-text="Create new Hosting Plan":::
-2. Select **Create** to create the Azure resources.
+1. In **Name**, enter a unique app name that includes only the valid characters `a-z`, `A-Z`, `0-9`, and `-`. You can accept the automatically generated unique name. The URL of the web app is `http://<app-name>.azurewebsites.net`, where `<app-name>` is your app name.
+1. Select **Create** to create the Azure resources.
- ![Create app resources](./media/quickstart-dotnetcore/web-app-name-vs2019.png)
+ :::image type="content" source="media/quickstart-dotnetcore/web-app-name-vs2019.png" border="true" alt-text="Visual Studio - Create app resources dialog.":::
Once the wizard completes, the Azure resources are created for you and you are ready to publish.
-3. Select **Finish** to close the wizard.
+1. Select **Finish** to close the wizard.
+1. In the **Publish** page, select **Publish**. Visual Studio builds, packages, and publishes the app to Azure, and then launches the app in the default browser.
-1. In the **Publish** page, click **Publish**. Visual Studio builds, packages, and publishes the app to Azure, and then launches the app in the default browser.
+ ### [.NET Core 3.1](#tab/netcore31)
- ![Published ASP.NET web app running in Azure](./media/quickstart-dotnetcore/web-app-running-live.png)
+ You'll see the ASP.NET Core 3.1 web app displayed in the page.
-**Congratulations!** Your ASP.NET Core web app is running live in Azure App Service.
+ :::image type="content" source="media/quickstart-dotnet/azure-webapp-net.png" lightbox="media/quickstart-dotnet/azure-webapp-net.png" border="true" alt-text="Visual Studio - ASP.NET Core 3.1 web app in Azure.":::
-## Update the app and redeploy
+ ### [.NET 5.0](#tab/net50)
-Follow these steps to update and redeploy your web app:
+ You'll see the ASP.NET Core 5.0 web app displayed in the page.
-1. In **Solution Explorer**, under your project, open **Pages** > **Index.cshtml**.
+ :::image type="content" source="media/quickstart-dotnet/azure-webapp-net.png" lightbox="media/quickstart-dotnet/azure-webapp-net.png" border="true" alt-text="Visual Studio - ASP.NET Core 5.0 web app in Azure.":::
-1. Replace the entire `<div>` tag with the following code:
+ ### [.NET Framework 4.8](#tab/netframework48)
- ```html
- <div class="jumbotron">
- <h1>ASP.NET in Azure!</h1>
- <p class="lead">This is a simple app that we've built that demonstrates how to deploy a .NET app to Azure App Service.</p>
- </div>
- ```
+ You'll see the ASP.NET Framework 4.8 web app displayed in the page.
-1. To redeploy to Azure, right-click the **myFirstAzureWebApp** project in **Solution Explorer** and select **Publish**.
+ :::image type="content" source="media/quickstart-dotnet/vs-azure-webapp-net48.png" lightbox="media/quickstart-dotnet/vs-azure-webapp-net48.png" border="true" alt-text="Visual Studio - ASP.NET Framework 4.8 web app in Azure.":::
-1. In the **Publish** summary page, select **Publish**.
+
- <!-- ![Publish update to web app](./media/quickstart-dotnetcore/publish-update-to-web-app-vs2019.png) -->
- When publishing completes, Visual Studio launches a browser to the URL of the web app.
- ![Updated ASP.NET web app running in Azure](./media/quickstart-dotnetcore/updated-web-app-running-live.png)
+To deploy your web app using the Visual Studio Code Azure Tools extension:
-## Manage the Azure app
+1. In Visual Studio Code, open the [**Command Palette**](https://code.visualstudio.com/docs/getstarted/userinterface#_command-palette), <kbd>Ctrl</kbd>+<kbd>Shift</kbd>+<kbd>P</kbd>.
+1. Search for and select "Azure App Service: Deploy to Web App".
+1. Respond to the prompts as follows:
-To manage your web app, go to the [Azure portal](https://portal.azure.com), and search for and select **App Services**.
+ - Select *MyFirstAzureWebApp* as the folder to deploy.
+ - Select **Add Config** when prompted.
+ - If prompted, sign in to your existing Azure account.
-![Select App Services](./media/quickstart-dotnetcore/app-services.png)
+ :::image type="content" source="media/quickstart-dotnet/vscode-sign-in-to-azure.png" alt-text="Visual Studio Code - Sign in to Azure." border="true":::
-On the **App Services** page, select the name of your web app.
+ - Select your **Subscription**.
+ - Select **Create new Web App... Advanced**.
+ - For **Enter a globally unique name**, use a name that's unique across all of Azure (*valid characters are `a-z`, `0-9`, and `-`*). A good pattern is to use a combination of your company name and an app identifier.
+ - Select **Create new resource group** and provide a name like `myResourceGroup`.
+ - When prompted to **Select a runtime stack**:
+ - For *.NET Core 3.1*, select **.NET Core 3.1 (LTS)**
+ - For *.NET 5.0*, select **.NET 5**
+ - For *.NET Framework 4.8*, select **ASP.NET V4.8**
+ - Select an operating system (Windows or Linux).
+ - For *.NET Framework 4.8*, Windows will be selected implicitly.
+ - Select **Create a new App Service plan**, provide a name, and select the **F1 Free** [pricing tier][app-service-pricing-tier].
+ - Select **Skip for now** for the Application Insights resource.
+ - Select a location near you.
+1. When publishing completes, select **Browse Website** in the notification and select **Open** when prompted.
-The **Overview** page for your web app, contains options for basic management like browse, stop, start, restart, and delete. The left menu provides further pages for configuring your app.
+ ### [.NET Core 3.1](#tab/netcore31)
-![App Service in Azure portal](./media/quickstart-dotnetcore/web-app-overview-page.png)
+ You'll see the ASP.NET Core 3.1 web app displayed in the page.
+ :::image type="content" source="media/quickstart-dotnet/azure-webapp-net.png" lightbox="media/quickstart-dotnet/azure-webapp-net.png" border="true" alt-text="Visual Studio Code - ASP.NET Core 3.1 web app in Azure.":::
-## Next steps
+ ### [.NET 5.0](#tab/net50)
-In this quickstart, you used Visual Studio to create and deploy an ASP.NET Core web app to Azure App Service.
+ You'll see the ASP.NET Core 5.0 web app displayed in the page.
-Advance to the next article to learn how to create a .NET Core app and connect it to a SQL Database:
+ :::image type="content" source="media/quickstart-dotnet/azure-webapp-net.png" lightbox="media/quickstart-dotnet/azure-webapp-net.png" border="true" alt-text="Visual Studio Code - ASP.NET Core 5.0 web app in Azure.":::
-> [!div class="nextstepaction"]
-> [ASP.NET Core with SQL Database](tutorial-dotnetcore-sqldb-app.md)
+ ### [.NET Framework 4.8](#tab/netframework48)
-> [!div class="nextstepaction"]
-> [Configure ASP.NET Core app](configure-language-dotnetcore.md)
+ You'll see the ASP.NET Framework 4.8 web app displayed in the page.
+
+ :::image type="content" source="media/quickstart-dotnet/azure-webapp-net48.png" lightbox="media/quickstart-dotnet/vs-azure-webapp-net48.png" border="true" alt-text="Visual Studio Code - ASP.NET Framework 4.8 web app in Azure.":::
+
+
-[App Service on Linux](overview.md#app-service-on-linux) provides a highly scalable, self-patching web hosting service using the Linux operating system. This quickstart shows how to create a [.NET Core](/aspnet/core/) app and deploy to a Linux hosted App Service using the [Azure CLI](/cli/azure/get-started-with-azure-cli).
+<!-- markdownlint-disable MD044 -->
+<!-- markdownlint-enable MD044 -->
-![Sample app running in Azure](media/quickstart-dotnetcore/dotnet-browse-azure.png)
+Deploy the code in your local *MyFirstAzureWebApp* directory using the [`az webapp up`](/cli/azure/webapp#az_webapp_up) command:
-You can follow the steps in this article using a Mac, Windows, or Linux machine.
+```azurecli
+az webapp up --sku F1 --name <app-name> --os-type <os>
+```
+- If the `az` command isn't recognized, be sure you have the Azure CLI installed as described in [Prerequisites](#prerequisites).
+- Replace `<app-name>` with a name that's unique across all of Azure (*valid characters are `a-z`, `0-9`, and `-`*). A good pattern is to use a combination of your company name and an app identifier.
+- The `--sku F1` argument creates the web app on the **Free** [pricing tier][app-service-pricing-tier]. Omit this argument to use a faster premium tier, which incurs an hourly cost.
+- Replace `<os>` with either `linux` or `windows`. You must use `windows` when targeting *ASP.NET Framework 4.8*.
+- You can optionally include the argument `--location <location-name>` where `<location-name>` is an available Azure region. You can retrieve a list of allowable regions for your Azure account by running the [`az account list-locations`](/cli/azure/appservice#az-appservice-list-locations) command.
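As a quick local sanity check of the naming rules above, you can test a candidate `<app-name>` before calling `az webapp up`. This is an illustrative helper written in plain shell, not part of the Azure CLI, and the example name is hypothetical:

```shell
# Hypothetical candidate name; replace with your own.
app_name="contoso-myfirstwebapp-001"

# App Service names allow lowercase letters, digits, and hyphens,
# with no leading or trailing hyphen.
if printf '%s' "$app_name" | grep -Eq '^[a-z0-9]([a-z0-9-]*[a-z0-9])?$'; then
  echo "valid app name: $app_name"
else
  echo "invalid app name: $app_name"
fi
```

A name that passes this check can still collide with an existing app, since names must be unique across all of Azure; the CLI reports that case when you deploy.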
-## Set up your initial environment
+The command may take a few minutes to complete. While running, it provides messages about creating the resource group, the App Service plan, and hosting app, configuring logging, then performing ZIP deployment. It then outputs a message with the app's URL:
-# [.NET Core 3.1](#tab/netcore31)
+```azurecli
+You can launch the app at http://<app-name>.azurewebsites.net
+```
-To complete this quickstart:
+Open a web browser and navigate to the URL:
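Because the URL follows a fixed pattern, you can also compose it from your app name in a script. A minimal sketch, using a hypothetical app name:

```shell
# Compose the app URL that `az webapp up` reports;
# the pattern is http://<app-name>.azurewebsites.net.
app_name="contoso-myfirstwebapp-001"
echo "http://${app_name}.azurewebsites.net"
```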
-* <a href="https://dotnet.microsoft.com/download/dotnet-core/3.1" target="_blank">Install the latest .NET Core 3.1 SDK</a>.
-* <a href="/cli/azure/install-azure-cli" target="_blank">Install the latest Azure CLI</a>.
+### [.NET Core 3.1](#tab/netcore31)
-# [.NET 5.0](#tab/net50)
+You'll see the ASP.NET Core 3.1 web app displayed in the page.
-To complete this quickstart:
-* <a href="https://dotnet.microsoft.com/download/dotnet/5.0" target="_blank">Install the latest .NET 5.0 SDK</a>.
-* <a href="/cli/azure/install-azure-cli" target="_blank">Install the latest Azure CLI</a>.
+### [.NET 5.0](#tab/net50)
+
+You'll see the ASP.NET Core 5.0 web app displayed in the page.
++
+### [.NET Framework 4.8](#tab/netframework48)
+
+You'll see the ASP.NET Framework 4.8 web app displayed in the page.
+
-[Having issues? Let us know.](https://aka.ms/DotNetAppServiceLinuxQuickStart)
-## Create the app locally
+## Update the app and redeploy
-In a terminal window on your machine, create a directory named `hellodotnetcore` and change the current directory to it.
+Follow these steps to update and redeploy your web app:
-```bash
-mkdir hellodotnetcore
-cd hellodotnetcore
-```
-Create a new .NET Core app.
+1. In **Solution Explorer**, under your project, open *Index.cshtml*.
+1. Replace the first `<div>` element with the following code:
-```bash
-dotnet new web
-```
+ ```razor
+ <div class="jumbotron">
+ <h1>.NET 💜 Azure</h1>
+ <p class="lead">Example .NET app to Azure App Service.</p>
+ </div>
+ ```
-## Run the app locally
+ Save your changes.
-Run the application locally so that you see how it should look when you deploy it to Azure.
+1. To redeploy to Azure, right-click the **MyFirstAzureWebApp** project in **Solution Explorer** and select **Publish**.
+1. In the **Publish** summary page, select **Publish**.
-```bash
-dotnet run
-```
+ When publishing completes, Visual Studio launches a browser to the URL of the web app.
-Open a web browser, and navigate to the app at `http://localhost:5000`.
+ ### [.NET Core 3.1](#tab/netcore31)
-You see the **Hello World** message from the sample app displayed in the page.
+ You'll see the updated ASP.NET Core 3.1 web app displayed in the page.
-![Test with browser](media/quickstart-dotnetcore/dotnet-browse-local.png)
+ :::image type="content" source="media/quickstart-dotnet/updated-azure-webapp-net.png" lightbox="media/quickstart-dotnet/updated-azure-webapp-net.png" border="true" alt-text="Visual Studio - Updated ASP.NET Core 3.1 web app in Azure.":::
-[Having issues? Let us know.](https://aka.ms/DotNetAppServiceLinuxQuickStart)
+ ### [.NET 5.0](#tab/net50)
-## Sign into Azure
-In your terminal window, log into Azure with the following command:
+ You'll see the updated ASP.NET Core 5.0 web app displayed in the page.
-```azurecli
-az login
-```
+ :::image type="content" source="media/quickstart-dotnet/updated-azure-webapp-net.png" lightbox="media/quickstart-dotnet/updated-azure-webapp-net.png" border="true" alt-text="Visual Studio - Updated ASP.NET Core 5.0 web app in Azure.":::
-## Deploy the app
+ ### [.NET Framework 4.8](#tab/netframework48)
-Deploy the code in your local folder (*hellodotnetcore*) using the `az webapp up` command:
+ You'll see the updated ASP.NET Framework 4.8 web app displayed in the page.
-```azurecli
-az webapp up --sku F1 --name <app-name>
-```
+ :::image type="content" source="media/quickstart-dotnet/vs-updated-azure-webapp-net48.png" lightbox="media/quickstart-dotnet/vs-updated-azure-webapp-net48.png" border="true" alt-text="Visual Studio - Updated ASP.NET Framework 4.8 web app in Azure.":::
-- If the `az` command isn't recognized, be sure you have the Azure CLI installed as described in [Set up your initial environment](#set-up-your-initial-environment).
-- Replace `<app-name>` with a name that's unique across all of Azure (*valid characters are `a-z`, `0-9`, and `-`*). A good pattern is to use a combination of your company name and an app identifier.
-- The `--sku F1` argument creates the web app on the Free pricing tier. Omit this argument to use a faster premium tier, which incurs an hourly cost.
-- You can optionally include the argument `--location <location-name>` where `<location-name>` is an available Azure region. You can retrieve a list of allowable regions for your Azure account by running the [`az account list-locations`](/cli/azure/appservice#az-appservice-list-locations) command.
+
-The command may take a few minutes to complete. While running, it provides messages about creating the resource group, the App Service plan and hosting app, configuring logging, then performing ZIP deployment. It then gives the message, "You can launch the app at http://&lt;app-name&gt;.azurewebsites.net", which is the app's URL on Azure.
-# [.NET Core 3.1](#tab/netcore31)
-![Example output of the az webapp up command](./media/quickstart-dotnetcore/az-webapp-up-output-3.1.png)
+1. Open *Index.cshtml*.
+1. Replace the first `<div>` element with the following code:
-# [.NET 5.0](#tab/net50)
+ ```razor
+ <div class="jumbotron">
+ <h1>.NET 💜 Azure</h1>
+ <p class="lead">Example .NET app to Azure App Service.</p>
+ </div>
+ ```
-<!-- Deploy the code in your local folder (*hellodotnetcore*) using the `az webapp up` command:
+ Save your changes.
-```azurecli
-az webapp up --sku B1 --name <app-name> --os-type linux
-```
+1. In the Visual Studio Code **Side Bar**, select the **Azure** icon to expand its options.
+1. Under the **APP SERVICE** node, expand your subscription and right-click on the **MyFirstAzureWebApp**.
+1. Select **Deploy to Web App...**.
+1. Select **Deploy** when prompted.
+1. When publishing completes, select **Browse Website** in the notification and select **Open** when prompted.
-- If the `az` command isn't recognized, be sure you have the Azure CLI installed as described in [Set up your initial environment](#set-up-your-initial-environment).
-- Replace `<app-name>` with a name that's unique across all of Azure (*valid characters are `a-z`, `0-9`, and `-`*). A good pattern is to use a combination of your company name and an app identifier.
-- The `--sku B1` argument creates the web app in the Basic pricing tier, which incurs an hourly cost. Omit this argument to use a faster premium tier, which costs more.
-- You can optionally include the argument `--location <location-name>` where `<location-name>` is an available Azure region. You can retrieve a list of allowable regions for your Azure account by running the [`az account list-locations`](/cli/azure/appservice#az-appservice-list-locations) command.
+ ### [.NET Core 3.1](#tab/netcore31)
-The command may take a few minutes to complete. While running, it provides messages about creating the resource group, the App Service plan and hosting app, configuring logging, then performing ZIP deployment. It then gives the message, "You can launch the app at http://&lt;app-name&gt;.azurewebsites.net", which is the app's URL on Azure. -->
+ You'll see the updated ASP.NET Core 3.1 web app displayed in the page.
-![Example output of the az webapp up command](./media/quickstart-dotnetcore/az-webapp-up-output-5.0.png)
+ :::image type="content" source="media/quickstart-dotnet/updated-azure-webapp-net.png" lightbox="media/quickstart-dotnet/updated-azure-webapp-net.png" border="true" alt-text="Visual Studio Code - Updated ASP.NET Core 3.1 web app in Azure.":::
-
+ ### [.NET 5.0](#tab/net50)
-[Having issues? Let us know.](https://aka.ms/DotNetAppServiceLinuxQuickStart)
+ You'll see the updated ASP.NET Core 5.0 web app displayed in the page.
+ :::image type="content" source="media/quickstart-dotnet/updated-azure-webapp-net.png" lightbox="media/quickstart-dotnet/updated-azure-webapp-net.png" border="true" alt-text="Visual Studio Code - Updated ASP.NET Core 5.0 web app in Azure.":::
-## Browse to the app
+ ### [.NET Framework 4.8](#tab/netframework48)
-Browse to the deployed application using your web browser.
+ You'll see the updated ASP.NET Framework 4.8 web app displayed in the page.
-```bash
-http://<app_name>.azurewebsites.net
+ :::image type="content" source="media/quickstart-dotnet/updated-azure-webapp-net48.png" lightbox="media/quickstart-dotnet/updated-azure-webapp-net48.png" border="true" alt-text="Visual Studio Code - Updated ASP.NET Framework 4.8 web app in Azure.":::
+
+
++
+<!-- markdownlint-disable MD044 -->
+<!-- markdownlint-enable MD044 -->
+
+In the local directory, open the *Index.cshtml* file. Replace the first `<div>` element:
+
+```razor
+<div class="jumbotron">
+ <h1>.NET 💜 Azure</h1>
+ <p class="lead">Example .NET app to Azure App Service.</p>
+</div>
```
-The .NET Core sample code is running in App Service on Linux with a built-in image.
+Save your changes, then redeploy the app using the `az webapp up` command again:
-![Sample app running in Azure](media/quickstart-dotnetcore/dotnet-browse-azure.png)
+### [.NET Core 3.1](#tab/netcore31)
-**Congratulations!** You've deployed your first .NET Core app to App Service on Linux.
+ASP.NET Core 3.1 is cross-platform. Based on your previous deployment, replace `<os>` with either `linux` or `windows`.
-[Having issues? Let us know.](https://aka.ms/DotNetAppServiceLinuxQuickStart)
+```azurecli
+az webapp up --os-type <os>
+```
-## Update and redeploy the code
+### [.NET 5.0](#tab/net50)
-In the local directory, open the _Startup.cs_ file. Make a small change to the text in the method call `context.Response.WriteAsync`:
+ASP.NET Core 5.0 is cross-platform. Based on your previous deployment, replace `<os>` with either `linux` or `windows`.
-```csharp
-await context.Response.WriteAsync("Hello Azure!");
+```azurecli
+az webapp up --os-type <os>
```
-Save your changes, then redeploy the app using the `az webapp up` command again:
+### [.NET Framework 4.8](#tab/netframework48)
+
+ASP.NET Framework 4.8 has framework dependencies and must be hosted on Windows.
```azurecli
-az webapp up --os-type linux
+az webapp up --os-type windows
```
+> [!TIP]
+> If you're interested in hosting your .NET apps on Linux, consider migrating from [ASP.NET Framework to ASP.NET Core](/aspnet/core/migration/proper-to-2x).
+++

This command uses values that are cached locally in the *.azure/config* file, including the app name, resource group, and App Service plan. Once deployment has completed, switch back to the browser window that opened in the **Browse to the app** step, and hit refresh.
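For reference, those cached values live in a *.azure/config* file in the project directory. A rough sketch of its shape is below; the section and key names are assumptions for illustration and vary by Azure CLI version, so treat this as a sketch rather than a specification:

```ini
; Illustrative only: the exact keys written by `az webapp up`
; depend on your CLI version and deployment.
[defaults]
group = myResourceGroup
sku = F1
appserviceplan = myAppServicePlan
location = westeurope
web = contoso-myfirstwebapp-001
```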
-![Updated sample app running in Azure](media/quickstart-dotnetcore/dotnet-browse-azure-updated.png)
+### [.NET Core 3.1](#tab/netcore31)
+
+You'll see the updated ASP.NET Core 3.1 web app displayed in the page.
++
+### [.NET 5.0](#tab/net50)
+
+You'll see the updated ASP.NET Core 5.0 web app displayed in the page.
++
+### [.NET Framework 4.8](#tab/netframework48)
+
+You'll see the updated ASP.NET Framework 4.8 web app displayed in the page.
++++
-[Having issues? Let us know.](https://aka.ms/DotNetAppServiceLinuxQuickStart)
+## Manage the Azure app
-## Manage your new Azure app
+To manage your web app, go to the [Azure portal](https://portal.azure.com), and search for and select **App Services**.
-Go to the <a href="https://portal.azure.com" target="_blank">Azure portal</a> to manage the app you created.
-From the left menu, click **App Services**, and then click the name of your Azure app.
+On the **App Services** page, select the name of your web app.
-You see your app's Overview page. Here, you can perform basic management tasks like browse, stop, start, restart, and delete.
+The **Overview** page for your web app contains options for basic management like browse, stop, start, restart, and delete. The left menu provides further pages for configuring your app.
-![App Service page in Azure portal](media/quickstart-dotnetcore/portal-app-overview-up.png)
-The left menu provides different pages for configuring your app.
+<!--
+## Clean up resources - H2 added from the next three includes
+-->
-[Having issues? Let us know.](https://aka.ms/DotNetAppServiceLinuxQuickStart)
+<!-- markdownlint-disable MD044 -->
+<!-- markdownlint-enable MD044 -->
## Next steps
+In this quickstart, you created and deployed an ASP.NET web app to Azure App Service.
+
+### [.NET Core 3.1](#tab/netcore31)
+
+Advance to the next article to learn how to create a .NET Core app and connect it to a SQL Database:
+
+> [!div class="nextstepaction"]
+> [Tutorial: ASP.NET Core app with SQL database](tutorial-dotnetcore-sqldb-app.md)
+
+> [!div class="nextstepaction"]
+> [Configure ASP.NET Core 3.1 app](configure-language-dotnetcore.md)
+
+### [.NET 5.0](#tab/net50)
+
+Advance to the next article to learn how to create a .NET Core app and connect it to a SQL Database:
+
+> [!div class="nextstepaction"]
+> [Tutorial: ASP.NET Core app with SQL database](tutorial-dotnetcore-sqldb-app.md)
+ > [!div class="nextstepaction"]
-> [Tutorial: ASP.NET Core app with SQL Database](tutorial-dotnetcore-sqldb-app.md)
+> [Configure ASP.NET Core 5.0 app](configure-language-dotnetcore.md)
+
+### [.NET Framework 4.8](#tab/netframework48)
+
+Advance to the next article to learn how to create a .NET Framework app and connect it to a SQL Database:
+
+> [!div class="nextstepaction"]
+> [Tutorial: ASP.NET app with SQL database](app-service-web-tutorial-dotnet-sqldatabase.md)
> [!div class="nextstepaction"]
-> [Configure ASP.NET Core app](configure-language-dotnetcore.md)
+> [Configure ASP.NET Framework app](configure-language-dotnet-framework.md)
++
+[app-service-pricing-tier]: https://azure.microsoft.com/pricing/details/app-service/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio
app-service Tutorial Java Spring Cosmosdb https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/tutorial-java-spring-cosmosdb.md
Open the `pom.xml` file in the `initial/spring-boot-todo` directory and add the
<plugin>
  <groupId>com.microsoft.azure</groupId>
  <artifactId>azure-webapp-maven-plugin</artifactId>
- <version>1.11.0</version>
+ <version>1.13.0</version>
<configuration>
  <schemaVersion>v2</schemaVersion>
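For context, the `v2` schema groups the deployment target details under `<configuration>`. The sketch below shows the general shape with placeholder values; the resource names and runtime values are assumptions for illustration, not taken from this tutorial:

```xml
<!-- Illustrative v2-schema configuration for azure-webapp-maven-plugin;
     values are placeholders, not from this tutorial. -->
<plugin>
  <groupId>com.microsoft.azure</groupId>
  <artifactId>azure-webapp-maven-plugin</artifactId>
  <version>1.13.0</version>
  <configuration>
    <schemaVersion>v2</schemaVersion>
    <resourceGroup>my-resource-group</resourceGroup>
    <appName>my-spring-todo-app</appName>
    <region>westus</region>
    <runtime>
      <os>Linux</os>
      <javaVersion>Java 8</javaVersion>
      <webContainer>Java SE</webContainer>
    </runtime>
  </configuration>
</plugin>
```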
bash-3.2$ mvn azure-webapp:deploy
The output contains the URL to your deployed application (in this example, `https://spring-todo-app.azurewebsites.net`). You can copy this URL into your web browser or run the following command in your Terminal window to load your app.

```bash
-curl https://spring-todo-app.azurewebsites.net
+explorer https://spring-todo-app.azurewebsites.net
```

You should see the app running with the remote URL in the address bar:
application-gateway Ingress Controller Letsencrypt Certificate Application Gateway https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/application-gateway/ingress-controller-letsencrypt-certificate-application-gateway.md
Follow the steps below to install [cert-manager](https://docs.cert-manager.io) o
helm repo update

# Install the cert-manager Helm chart
+ # Helm v3+
+ helm install \
+ cert-manager jetstack/cert-manager \
+ --namespace cert-manager \
+ --version v1.0.4
+ # --set installCRDs=true
+
+ # Helm v2
helm install \
 --name cert-manager \
 --namespace cert-manager \
- --version v0.8.0 \
- jetstack/cert-manager
+ --version v1.0.4 \
+ jetstack/cert-manager
+ # --set installCRDs=true
+
+ #To automatically install and manage the CRDs as part of your Helm release,
+ # you must add the --set installCRDs=true flag to your Helm installation command.
``` 2. ClusterIssuer Resource
automation Quickstart Create Automation Account Template https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/automation/quickstart-create-automation-account-template.md
Title: "Quickstart: Create an Automation account - Azure template"
description: This quickstart shows how to create an Automation account by using the Azure Resource Manager template.
-Customer intent: I want to create an Automation account by using an Azure Resource Manager template so that I can automate processes with runbooks.
+# Customer intent: I want to create an Automation account by using an Azure Resource Manager template so that I can automate processes with runbooks.
Last updated 01/07/2021
automation Schedules https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/automation/shared-resources/schedules.md
Title: Manage schedules in Azure Automation
description: This article tells how to create and work with a schedule in Azure Automation. Previously updated : 03/19/2021 Last updated : 03/29/2021
$automationAccountName = "MyAutomationAccount"
$runbookName = "Test-Runbook"
$scheduleName = "Sample-DailySchedule"
$params = @{"FirstName"="Joe";"LastName"="Smith";"RepeatCount"=2;"Show"=$true}
-Register-AzAutomationScheduledRunbook ΓÇôAutomationAccountName $automationAccountName `
-ΓÇôName $runbookName ΓÇôScheduleName $scheduleName ΓÇôParameters $params `
+Register-AzAutomationScheduledRunbook -AutomationAccountName $automationAccountName `
+-Name $runbookName -ScheduleName $scheduleName -Parameters $params `
-ResourceGroupName "ResourceGroup01"
```
The following example shows how to disable a schedule for a runbook by using an
```azurepowershell-interactive
$automationAccountName = "MyAutomationAccount"
$scheduleName = "Sample-MonthlyDaysOfMonthSchedule"
-Set-AzAutomationSchedule ΓÇôAutomationAccountName $automationAccountName `
-ΓÇôName $scheduleName ΓÇôIsEnabled $false -ResourceGroupName "ResourceGroup01"
+Set-AzAutomationSchedule -AutomationAccountName $automationAccountName `
+-Name $scheduleName -IsEnabled $false -ResourceGroupName "ResourceGroup01"
```

## Remove a schedule
availability-zones Az Region https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/availability-zones/az-region.md
description: To create highly available and resilient applications in Azure, Ava
Previously updated : 03/16/2021 Last updated : 03/30/2021
To achieve comprehensive business continuity on Azure, build your application ar
| Americas | Europe | Africa | Asia Pacific |
|--|--|--|--|
| | | | |
-| Brazil South | France Central | South Africa North* | Japan East |
-| Canada Central | Germany West Central | | Southeast Asia |
-| Central US | North Europe | | Australia East |
-| East US | UK South | | |
+| Brazil South | France Central | South Africa North* | Australia East |
+| Canada Central | Germany West Central | | Japan East |
+| Central US | North Europe | | Korea Central* |
+| East US | UK South | | Southeast Asia |
| East US 2 | West Europe | | |
| South Central US | | | |
-| US Gov Virginia | | | |
+| US Gov Virginia | | | |
| West US 2 | | | |
+| West US 3* | | | |
\* To learn more about Availability Zones and available services support in these regions, contact your Microsoft sales or customer representative. For the upcoming regions that will support Availability Zones, see [Azure geographies](https://azure.microsoft.com/en-us/global-infrastructure/geographies/).
azure-app-configuration Pull Key Value Devops Pipeline https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-app-configuration/pull-key-value-devops-pipeline.md
A [service connection](/azure/devops/pipelines/library/service-endpoints) allows
1. Under **Pipelines** select **Service connections**.
1. If you don't have any existing service connections, click the **Create service connection** button in the middle of the screen. Otherwise, click **New service connection** in the top right of the page.
1. Select **Azure Resource Manager**.
-1. Select **Service principal (automatic)**.
+![Screenshot shows selecting Azure Resource Manager from the New service connection dropdown list.](./media/new-service-connection.png)
+1. In the **Authentication method** dialog, select **Service principal (automatic)**.
+ > [!NOTE]
+ > **Managed identity** authentication is currently unsupported for the App Configuration task.
1. Fill in your subscription and resource. Give your service connection a name.

Now that your service connection is created, find the name of the service principal assigned to it. You'll add a new role assignment to this service principal in the next step.
Assign the proper App Configuration role to the service connection being used wi
1. Navigate to your target App Configuration store. For a walkthrough of setting up an App Configuration store, see [Create an App Configuration store](./quickstart-dotnet-core-app.md#create-an-app-configuration-store) in one of the Azure App Configuration quickstarts.
1. On the left, select **Access control (IAM)**.
-1. At the top, select **+ Add** and pick **Add role assignment**.
+1. On the right side, click the **Add role assignments** button.
+![Screenshot shows the Add role assignments button.](./media/add-role-assignment-button.png)
1. Under **Role**, select **App Configuration Data Reader**. This role allows the task to read from the App Configuration store.
1. Select the service principal associated with the service connection that you created in the previous section.
+![Screenshot shows the Add role assignment dialog.](./media/add-role-assignment-reader.png)
> [!NOTE]
> To resolve Azure Key Vault references within App Configuration, the service connection must also be granted permission to read secrets in the referenced Azure Key Vaults.
Assign the proper App Configuration role to the service connection being used wi
This section will cover how to use the Azure App Configuration task in an Azure DevOps build pipeline.

1. Navigate to the build pipeline page by clicking **Pipelines** > **Pipelines**. For build pipeline documentation, see [Create your first pipeline](/azure/devops/pipelines/create-first-pipeline?tabs=net%2Ctfs-2018-2%2Cbrowser).
- - If you're creating a new build pipeline, click **New pipeline**, select the repository for your pipeline. Select **Show assistant** on the right side of the pipeline, and search for the **Azure App Configuration** task.
- - If you're using an existing build pipeline, select **Edit** to edit the pipeline. In the **Tasks** tab, search for the **Azure App Configuration** Task.
+ - If you're creating a new build pipeline, on the last step of the process, on the **Review** tab, select **Show assistant** on the right side of the pipeline.
+ ![Screenshot shows the Show assistant button for a new pipeline.](./media/new-pipeline-show-assistant.png)
+ - If you're using an existing build pipeline, click the **Edit** button at the top-right.
+ ![Screenshot shows the Edit button for an existing pipeline.](./media/existing-pipeline-show-assistant.png)
+1. Search for the **Azure App Configuration** Task.
+![Screenshot shows the Add Task dialog with Azure App Configuration in the search box.](./media/add-azure-app-configuration-task.png)
1. Configure the necessary parameters for the task to pull the key-values from the App Configuration store. Descriptions of the parameters are available in the **Parameters** section below and in tooltips next to each parameter.
   - Set the **Azure subscription** parameter to the name of the service connection you created in a previous step.
   - Set the **App Configuration name** to the resource name of your App Configuration store.
   - Leave the default values for the remaining parameters.
+![Screenshot shows the app configuration task parameters.](./media/azure-app-configuration-parameters.png)
1. Save and queue a build. The build log will display any failures that occurred during the execution of the task. ## Use in releases
This section will cover how to use the Azure App Configuration task in an Azure
1. Navigate to the release pipeline page by selecting **Pipelines** > **Releases**. For release pipeline documentation, see [Release pipelines](/azure/devops/pipelines/release).
1. Choose an existing release pipeline. If you don't have one, click **New pipeline** to create a new one.
1. Select the **Edit** button in the top-right corner to edit the release pipeline.
-1. Choose the **Stage** to add the task. For more information about stages, see [Add stages, dependencies, & conditions](/azure/devops/pipelines/release/environments).
-1. Click **+** for on "Run on agent", then add the **Azure App Configuration** task under the **Add tasks** tab.
+1. From the **Tasks** dropdown, choose the **Stage** to which you want to add the task. More information about stages can be found [here](/azure/devops/pipelines/release/environments).
+![Screenshot shows the selected stage in the Tasks dropdown.](./media/pipeline-stage-tasks.png)
+1. Click **+** next to the Job to which you want to add a new task.
+![Screenshot shows the plus button next to the job.](./media/add-task-to-job.png)
+1. Search for the **Azure App Configuration** Task.
+![Screenshot shows the Add Task dialog with Azure App Configuration in the search box.](./media/add-azure-app-configuration-task.png)
1. Configure the necessary parameters within the task to pull your key-values from your App Configuration store. Descriptions of the parameters are available in the **Parameters** section below and in tooltips next to each parameter.
   - Set the **Azure subscription** parameter to the name of the service connection you created in a previous step.
   - Set the **App Configuration name** to the resource name of your App Configuration store.
azure-app-configuration Push Kv Devops Pipeline https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-app-configuration/push-kv-devops-pipeline.md
A [service connection](/azure/devops/pipelines/library/service-endpoints) allows
1. In Azure DevOps, go to the project containing your target pipeline and open the **Project settings** at the bottom left.
1. Under **Pipelines** select **Service connections** and select **New service connection** in the top right.
1. Select **Azure Resource Manager**.
-1. Select **Service principal (automatic)**.
+![Screenshot shows selecting Azure Resource Manager from the New service connection dropdown list.](./media/new-service-connection.png)
+1. In the **Authentication method** dialog, select **Service principal (automatic)**.
+ > [!NOTE]
+ > **Managed identity** authentication is currently unsupported for the App Configuration task.
1. Fill in your subscription and resource. Give your service connection a name.

Now that your service connection is created, find the name of the service principal assigned to it. You'll add a new role assignment to this service principal in the next step.
Now that your service connection is created, find the name of the service princi
1. Select the service connection that you created in the previous section.
1. Select **Manage Service Principal**.
1. Note the **Display name** listed.
+![Screenshot shows the service principal display name.](./media/service-principal-display-name.png)
## Add role assignment
Assign the proper App Configuration role assignments to the credentials being us
1. Navigate to your target App Configuration store.
1. On the left, select **Access control (IAM)**.
-1. At the top, select **+ Add** and pick **Add role assignment**.
+1. On the right side, click the **Add role assignments** button.
+![Screenshot shows the Add role assignments button.](./media/add-role-assignment-button.png)
1. Under **Role**, select **App Configuration Data Owner**. This role allows the task to read from and write to the App Configuration store.
1. Select the service principal associated with the service connection that you created in the previous section.
+![Screenshot shows the Add role assignment dialog.](./media/add-role-assignment.png)
## Use in builds

This section will cover how to use the Azure App Configuration Push task in an Azure DevOps build pipeline.

1. Navigate to the build pipeline page by clicking **Pipelines** > **Pipelines**. Documentation for build pipelines can be found [here](/azure/devops/pipelines/create-first-pipeline?tabs=tfs-2018-2).
- - If you're creating a new build pipeline, select **Show assistant** on the right side of the pipeline, and search for the **Azure App Configuration Push** task.
- - If you're using an existing build pipeline, navigate to the **Tasks** tab when editing the pipeline, and search for the **Azure App Configuration Push** Task.
-2. Configure the necessary parameters for the task to push the key-values from the configuration file to the App Configuration store. The **Configuration File Path** parameter begins at the root of the file repository.
-3. Save and queue a build. The build log will display any failures that occurred during the execution of the task.
+ - If you're creating a new build pipeline, on the last step of the process, on the **Review** tab, select **Show assistant** on the right side of the pipeline.
+ ![Screenshot shows the Show assistant button for a new pipeline.](./media/new-pipeline-show-assistant.png)
+ - If you're using an existing build pipeline, click the **Edit** button at the top-right.
+ ![Screenshot shows the Edit button for an existing pipeline.](./media/existing-pipeline-show-assistant.png)
+1. Search for the **Azure App Configuration Push** Task.
+![Screenshot shows the Add Task dialog with Azure App Configuration Push in the search box.](./media/add-azure-app-configuration-push-task.png)
+1. Configure the necessary parameters for the task to push the key-values from the configuration file to the App Configuration store. Explanations of the parameters are available in the **Parameters** section below, and in tooltips next to each parameter.
+![Screenshot shows the app configuration push task parameters.](./media/azure-app-configuration-push-parameters.png)
+1. Save and queue a build. The build log will display any failures that occurred during the execution of the task.
## Use in releases
This section will cover how to use the Azure App Configuration Push task in an A
1. Navigate to the release pipeline page by selecting **Pipelines** > **Releases**. Documentation for release pipelines can be found [here](/azure/devops/pipelines/release).
1. Choose an existing release pipeline. If you don't have one, select **+ New** to create a new one.
1. Select the **Edit** button in the top-right corner to edit the release pipeline.
-1. Choose the **Stage** to add the task. More information about stages can be found [here](/azure/devops/pipelines/release/environments).
-1. Select **+** for that Job, then add the **Azure App Configuration Push** task under the **Deploy** tab.
+1. From the **Tasks** dropdown, choose the **Stage** to which you want to add the task. More information about stages can be found [here](/azure/devops/pipelines/release/environments).
+![Screenshot shows the selected stage in the Tasks dropdown.](./media/pipeline-stage-tasks.png)
+1. Click **+** next to the Job to which you want to add a new task.
+![Screenshot shows the plus button next to the job.](./media/add-task-to-job.png)
+1. In the **Add tasks** dialog, type **Azure App Configuration Push** into the search box and select it.
1. Configure the necessary parameters within the task to push your key-values from your configuration file to your App Configuration store. Explanations of the parameters are available in the **Parameters** section below, and in tooltips next to each parameter.
1. Save and queue a release. The release log will display any failures encountered during the execution of the task.
The following parameters are used by the App Configuration Push task:
- **Azure subscription**: A drop-down containing your available Azure service connections. To update and refresh your list of available Azure service connections, press the **Refresh Azure subscription** button to the right of the textbox.
- **App Configuration Name**: A drop-down that loads your available configuration stores under the selected subscription. To update and refresh your list of available configuration stores, press the **Refresh App Configuration Name** button to the right of the textbox.
-- **Configuration File Path**: The path to your configuration file. You can browse through your build artifact to select a configuration file. (`...` button to the right of the textbox). The supported file formats are: yaml, json, properties.
+- **Configuration File Path**: The path to your configuration file. The **Configuration File Path** parameter begins at the root of the file repository. You can browse through your build artifact to select a configuration file. (`...` button to the right of the textbox). The supported file formats are: yaml, json, properties. The following is an example configuration file in json format.
+ ```json
+ {
+ "TestApp:Settings:BackgroundColor":"#FFF",
+ "TestApp:Settings:FontColor":"#000",
+ "TestApp:Settings:FontSize":"24",
+ "TestApp:Settings:Message": "Message data"
+ }
+ ```
- **Separator**: The separator that's used to flatten .json and .yml files.
- **Depth**: The depth to which the .json and .yml files will be flattened.
- **Prefix**: A string that's prepended to each key pushed to the App Configuration store.
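To illustrate how a separator and depth interact when a nested configuration file is flattened, here is a small Python sketch. This is not the task's actual implementation; the semantics are approximated from the parameter descriptions above, and the `flatten` helper is hypothetical.

```python
def flatten(obj, separator=":", depth=None, prefix=""):
    """Flatten a nested dict into separator-joined keys.

    Approximates the Separator/Depth/Prefix options: nesting deeper than
    `depth` levels is kept as a raw value rather than flattened further.
    """
    result = {}

    def walk(node, parts):
        reached_depth = depth is not None and len(parts) >= depth
        if isinstance(node, dict) and node and not reached_depth:
            for key, value in node.items():
                walk(value, parts + [key])
        else:
            result[prefix + separator.join(parts)] = node

    walk(obj, [])
    return result


settings = {"TestApp": {"Settings": {"FontSize": "24", "Message": "Message data"}}}
print(flatten(settings))           # fully flattened, e.g. "TestApp:Settings:FontSize"
print(flatten(settings, depth=1))  # only the top level is flattened
```

With `separator="."` the same input would instead yield keys such as `TestApp.Settings.FontSize`, and a `prefix` is simply prepended to every resulting key.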
The following parameters are used by the App Configuration Push task:
- **Checked**: Removes all key-values in the App Configuration store that match both the specified prefix and label before pushing new key-values from the configuration file.
- **Unchecked**: Pushes all key-values from the configuration file into the App Configuration store and leaves everything else in the App Configuration store intact.
-After filling out required parameters, run the pipeline. All key-values in the specified configuration file will be uploaded to App Configuration.
+ ## Troubleshooting
azure-cache-for-redis Cache Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-cache-for-redis/cache-overview.md
Last updated 02/08/2021
-#As a developer, I want to understand what Azure Cache for Redis is and how it can improve performance in my application.
+#Customer intent: As a developer, I want to understand what Azure Cache for Redis is and how it can improve performance in my application.
# About Azure Cache for Redis
azure-functions Durable Functions Serialization And Persistence https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/durable/durable-functions-serialization-and-persistence.md
Last updated 02/11/2021
-#Customer intent: As a developer, I want to understand what data is persisted to durable storage, how that data is serialized, and how
-#I can customize it when it doesn't work the way my app needs it to.
+#Customer intent: As a developer, I want to understand what data is persisted to durable storage, how that data is serialized, and how I can customize it when it doesn't work the way my app needs it to.
# Data persistence and serialization in Durable Functions (Azure Functions)
azure-functions Functions Bindings Storage Blob Input https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-bindings-storage-blob-input.md
The `dataType` property determines which binding is used. The following values a
| Binding value | Default | Description | Example | | | | | |
-| `undefined` | Y | Uses rich binding | `def main(input: func.InputStream)` |
| `string` | N | Uses generic binding and casts the input type as a `string` | `def main(input: str)` |
| `binary` | N | Uses generic binding and casts the input blob as `bytes` Python object | `def main(input: bytes)` |
+If the `dataType` property is not defined in function.json, the default value is `string`.
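For example, a *function.json* blob input binding that sets `dataType` explicitly might look like the following sketch (the path and connection names are illustrative):

```json
{
  "name": "inputblob",
  "type": "blob",
  "dataType": "binary",
  "path": "samples-workitems/{queueTrigger}",
  "connection": "MyStorageConnectionAppSetting",
  "direction": "in"
}
```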
Here's the Python code:

```python
import logging
import azure.functions as func
-def main(queuemsg: func.QueueMessage, inputblob: func.InputStream) -> func.InputStream:
- logging.info('Python Queue trigger function processed %s', inputblob.name)
+# The type func.InputStream is not supported for blob input binding.
+# The input binding field inputblob can either be 'bytes' or 'str' depends
+# on dataType in function.json, 'binary' or 'string'.
+def main(queuemsg: func.QueueMessage, inputblob: bytes) -> bytes:
+ logging.info(f'Python Queue trigger function processed {len(inputblob)} bytes')
    return inputblob
```
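As a hedged sketch (not part of the article's sample) of how a handler might cope with either configured `dataType`, recall that the payload arrives as `bytes` for `binary` and as `str` for `string`; a hypothetical helper could branch on the type:

```python
def summarize_blob(inputblob):
    """Return a short summary whether the binding delivered the blob as
    bytes (dataType 'binary') or as str (dataType 'string')."""
    if isinstance(inputblob, bytes):
        # Binary payload: decode before measuring, replacing bad sequences.
        text = inputblob.decode("utf-8", errors="replace")
    else:
        # String payload: already decoded by the generic binding.
        text = inputblob
    return f"{len(text)} characters"
```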
azure-functions Functions Bindings Storage Blob Output https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-bindings-storage-blob-output.md
In the *function.json* file, the `queueTrigger` metadata property is used to spe
{ "name": "inputblob", "type": "blob",
+ "dataType": "binary",
"path": "samples-workitems/{queueTrigger}", "connection": "MyStorageConnectionAppSetting", "direction": "in"
In the *function.json* file, the `queueTrigger` metadata property is used to spe
{ "name": "outputblob", "type": "blob",
+ "dataType": "binary",
"path": "samples-workitems/{queueTrigger}-Copy", "connection": "MyStorageConnectionAppSetting", "direction": "out"
import logging
import azure.functions as func
-def main(queuemsg: func.QueueMessage, inputblob: func.InputStream,
- outputblob: func.Out[func.InputStream]):
- logging.info('Python Queue trigger function processed %s', inputblob.name)
+def main(queuemsg: func.QueueMessage, inputblob: bytes, outputblob: func.Out[bytes]):
+ logging.info(f'Python Queue trigger function processed {len(inputblob)} bytes')
    outputblob.set(inputblob)
```
azure-functions Functions Create Student Starter https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-create-student-starter.md
Title: Create a function using Azure for Students Starter description: Learn how to create an Azure Function from within an Azure for Student Starter subscription
-Customer intent: As a student, I want to be able to create an HTTP triggered Function App within the Student Starter plan so that I can easily add APIs to any project.
+# Customer intent: As a student, I want to be able to create an HTTP triggered Function App within the Student Starter plan so that I can easily add APIs to any project.
Last updated 04/29/2020
azure-functions Python Memory Profiler Reference https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/python-memory-profiler-reference.md
+
+ Title: Memory profiling on Python apps in Azure Functions
+description: Learn how to profile Python apps memory usage and identify memory bottleneck.
+Last updated : 3/22/2021
+# Profile Python apps memory usage in Azure Functions
+
+During development or after deploying your local Python function app project to Azure, it's a good practice to analyze for potential memory bottlenecks in your functions. Such bottlenecks can decrease the performance of your functions and lead to errors. The following instructions show you how to use the [memory-profiler](https://pypi.org/project/memory-profiler) Python package, which provides line-by-line memory consumption analysis of your functions as they execute.
+
+> [!NOTE]
+> Memory profiling is intended only for memory footprint analysis in development environments. Don't use the memory profiler on production function apps.
+
+## Prerequisites
+
+Before you start developing a Python function app, you must meet these requirements:
+
+* [Python 3.6.x or above](https://www.python.org/downloads/release/python-374/). To check the full list of supported Python versions in Azure Functions, please visit [Python developer guide](functions-reference-python.md#python-version).
+
+* The [Azure Functions Core Tools](functions-run-local.md#v2) version 3.x.
+
+* [Visual Studio Code](https://code.visualstudio.com/) installed on one of the [supported platforms](https://code.visualstudio.com/docs/supporting/requirements#_platforms).
+
+* An active Azure subscription.
++
+## Memory profiling process
+
+1. In your requirements.txt, add `memory-profiler` to ensure the package will be bundled with your deployment. If you're developing on your local machine, you may want to [activate a Python virtual environment](create-first-function-cli-python.md#create-venv) and resolve the packages with `pip install -r requirements.txt`.
+
+2. In your function script (usually \_\_init\_\_.py), add the following lines above the `main()` function. This will ensure the root logger reports the child logger names, so that the memory profiling logs are distinguishable by the prefix `memory_profiler_logs`.
+
+ ```python
+ import logging
+ import memory_profiler
+ root_logger = logging.getLogger()
+ root_logger.handlers[0].setFormatter(logging.Formatter("%(name)s: %(message)s"))
+ profiler_logstream = memory_profiler.LogFile('memory_profiler_logs', True)
+    ```
+3. Apply the following decorator above any functions that need memory profiling. This does not work directly on the trigger entry point `main()` method; you need to create subfunctions and decorate them. Also, due to a known issue in memory-profiler, when the decorator is applied to an async coroutine, the coroutine's return value will always be `None`.
+
+ ```python
+    @memory_profiler.profile(stream=profiler_logstream)
+    ```
+4. Test the memory profiler on your local machine by using the Azure Functions Core Tools command `func host start`. This should generate a memory usage report that includes file name, line number, memory usage, memory increment, and line contents.
+
+5. To check the memory profiling logs on an existing function app instance in Azure, query the memory profiling logs from recent invocations by pasting the following Kusto query into Application Insights > Logs.
++
+```text
+traces
+| where timestamp > ago(1d)
+| where message startswith_cs "memory_profiler_logs:"
+| parse message with "memory_profiler_logs: " LineNumber " " TotalMem_MiB " " IncreMem_MiB " " Occurences " " Contents
+| union (
+ traces
+ | where timestamp > ago(1d)
+ | where message startswith_cs "memory_profiler_logs: Filename: "
+ | parse message with "memory_profiler_logs: Filename: " FileName
+ | project timestamp, FileName, itemId
+)
+| project timestamp, LineNumber=iff(FileName != "", FileName, LineNumber), TotalMem_MiB, IncreMem_MiB, Occurences, Contents, RequestId=itemId
+| order by timestamp asc
+```
+
+## Example
+
+Here's an example of performing memory profiling on an asynchronous and a synchronous HTTP trigger, named "HttpTriggerAsync" and "HttpTriggerSync" respectively. We'll build a Python function app that simply sends GET requests to Microsoft's home page.
+
+### Create a Python function app
+
+A Python function app should follow Azure Functions specified [folder structure](functions-reference-python.md#folder-structure). To scaffold the project, we recommend using the Azure Functions Core Tools by running the following commands:
+
+```bash
+func init PythonMemoryProfilingDemo --python
+cd PythonMemoryProfilingDemo
+func new -l python -t HttpTrigger -n HttpTriggerAsync -a anonymous
+func new -l python -t HttpTrigger -n HttpTriggerSync -a anonymous
+```
+
+### Update file contents
+
+The requirements.txt defines the packages that will be used in our project. Besides the Azure Functions SDK and memory-profiler, we introduce `aiohttp` for asynchronous HTTP requests and `requests` for synchronous HTTP calls.
+
+```text
+# requirements.txt
+
+azure-functions
+memory-profiler
+aiohttp
+requests
+```
+
+We also need to rewrite the asynchronous HTTP trigger `HttpTriggerAsync/__init__.py` and configure the memory profiler, root logger format, and logger streaming binding.
+
+```python
+# HttpTriggerAsync/__init__.py
+
+import azure.functions as func
+import aiohttp
+import logging
+import memory_profiler
+
+# Update root logger's format to include the logger name. Ensure logs generated
+# from memory profiler can be filtered by "memory_profiler_logs" prefix.
+root_logger = logging.getLogger()
+root_logger.handlers[0].setFormatter(logging.Formatter("%(name)s: %(message)s"))
+profiler_logstream = memory_profiler.LogFile('memory_profiler_logs', True)
+
+async def main(req: func.HttpRequest) -> func.HttpResponse:
+ await get_microsoft_page_async('https://microsoft.com')
+ return func.HttpResponse(
+ f"Microsoft Page Is Loaded",
+ status_code=200
+ )
+
+@memory_profiler.profile(stream=profiler_logstream)
+async def get_microsoft_page_async(url: str):
+ async with aiohttp.ClientSession() as client:
+ async with client.get(url) as response:
+ await response.text()
+ # @memory_profiler.profile does not support return for coroutines.
+ # All returns become None in the parent functions.
+ # GitHub Issue: https://github.com/pythonprofilers/memory_profiler/issues/289
+```
+
+For the synchronous HTTP trigger, refer to the following `HttpTriggerSync/__init__.py` code section:
+
+```python
+# HttpTriggerSync/__init__.py
+
+import azure.functions as func
+import requests
+import logging
+import memory_profiler
+
+# Update root logger's format to include the logger name. Ensure logs generated
+# from memory profiler can be filtered by "memory_profiler_logs" prefix.
+root_logger = logging.getLogger()
+root_logger.handlers[0].setFormatter(logging.Formatter("%(name)s: %(message)s"))
+profiler_logstream = memory_profiler.LogFile('memory_profiler_logs', True)
+
+def main(req: func.HttpRequest) -> func.HttpResponse:
+ content = profile_get_request('https://microsoft.com')
+ return func.HttpResponse(
+ f"Microsoft Page Response Size: {len(content)}",
+ status_code=200
+ )
+
+@memory_profiler.profile(stream=profiler_logstream)
+def profile_get_request(url: str):
+ response = requests.get(url)
+ return response.content
+```
+
+### Profile Python function app in local development environment
+
+After making all the above changes, there are a few more steps to initialize a Python virtual environment for the Azure Functions runtime.
+
+1. Open a Windows PowerShell or any Linux shell as you prefer.
+2. Create a Python virtual environment by `py -m venv .venv` in Windows, or `python3 -m venv .venv` in Linux.
+3. Activate the Python virtual environment with `.venv\Scripts\Activate.ps1` in Windows PowerShell or `source .venv/bin/activate` in a Linux shell.
+4. Restore the Python dependencies with `pip install -r requirements.txt`.
+5. Start the Azure Functions runtime locally with the Azure Functions Core Tools: `func host start`.
+6. Send a GET request to `http://localhost:7071/api/HttpTriggerAsync` or `http://localhost:7071/api/HttpTriggerSync`.
+7. It should show a memory profiling report similar to the following in Azure Functions Core Tools.
+
+ ```text
+ Filename: <ProjectRoot>\HttpTriggerAsync\__init__.py
+ Line # Mem usage Increment Occurences Line Contents
+ ============================================================
+ 19 45.1 MiB 45.1 MiB 1 @memory_profiler.profile
+ 20 async def get_microsoft_page_async(url: str):
+ 21 45.1 MiB 0.0 MiB 1 async with aiohttp.ClientSession() as client:
+ 22 46.6 MiB 1.5 MiB 10 async with client.get(url) as response:
+ 23 47.6 MiB 1.0 MiB 4 await response.text()
+ ```
+
+## Next steps
+
+For more information about Azure Functions Python development, see the following resources:
+
+* [Azure Functions Python developer guide](functions-reference-python.md)
+* [Best practices for Azure Functions](functions-best-practices.md)
+* [Azure Functions developer reference](functions-reference.md)
azure-functions Recover Python Functions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/recover-python-functions.md
Last updated 07/29/2020

# Troubleshoot Python errors in Azure Functions
Following is a list of troubleshooting guides for common issues in Python functi
* [ModuleNotFoundError and ImportError](#troubleshoot-modulenotfounderror) * [Cannot import 'cygrpc'](#troubleshoot-cannot-import-cygrpc)
+* [Python exited with code 137](#troubleshoot-python-exited-with-code-137)
+* [Python exited with code 139](#troubleshoot-python-exited-with-code-139)
## Troubleshoot ModuleNotFoundError
This section helps you troubleshoot module-related errors in your Python functio
> `Exception: ModuleNotFoundError: No module named 'module_name'.`
-This error issue occurs when a Python function app fails to load a Python module. The root cause for this error is one of the following issues:
+This error occurs when a Python function app fails to load a Python module. The root cause for this error is one of the following issues:
-- [The package can't be found](#the-package-cant-be-found)
-- [The package isn't resolved with proper Linux wheel](#the-package-isnt-resolved-with-proper-linux-wheel)
-- [The package is incompatible with the Python interpreter version](#the-package-is-incompatible-with-the-python-interpreter-version)
-- [The package conflicts with other packages](#the-package-conflicts-with-other-packages)
-- [The package only supports Windows or macOS platforms](#the-package-only-supports-windows-or-macos-platforms)
+* [The package can't be found](#the-package-cant-be-found)
+* [The package isn't resolved with proper Linux wheel](#the-package-isnt-resolved-with-proper-linux-wheel)
+* [The package is incompatible with the Python interpreter version](#the-package-is-incompatible-with-the-python-interpreter-version)
+* [The package conflicts with other packages](#the-package-conflicts-with-other-packages)
+* [The package only supports Windows or macOS platforms](#the-package-only-supports-windows-or-macos-platforms)
### View project files

To identify the actual cause of your issue, you need to get the Python project files that run on your function app. If you don't have the project files on your local computer, you can get them in one of the following ways:

-- If the function app has `WEBSITE_RUN_FROM_PACKAGE` app setting and its value is a URL, download the file by copy and paste the URL into your browser.
-- If the function app has `WEBSITE_RUN_FROM_PACKAGE` and it is set to `1`, navigate to `https://<app-name>.scm.azurewebsites.net/api/vfs/data/SitePackages` and download the file from the latest `href` URL.
-- If the function app doesn't have the app setting mentioned above, navigate to `https://<app-name>.scm.azurewebsites.net/api/settings` and find the URL under `SCM_RUN_FROM_PACKAGE`. Download the file by copy and paste the URL into your browser.
-- If none of these works for you, navigate to `https://<app-name>.scm.azurewebsites.net/DebugConsole` and reveal the content under `/home/site/wwwroot`.
+* If the function app has the `WEBSITE_RUN_FROM_PACKAGE` app setting and its value is a URL, download the file by copying and pasting the URL into your browser.
+* If the function app has `WEBSITE_RUN_FROM_PACKAGE` and its value is `1`, navigate to `https://<app-name>.scm.azurewebsites.net/api/vfs/data/SitePackages` and download the file from the latest `href` URL.
+* If the function app doesn't have the `WEBSITE_RUN_FROM_PACKAGE` app setting, navigate to `https://<app-name>.scm.azurewebsites.net/api/settings` and find the URL under `SCM_RUN_FROM_PACKAGE`. Download the file by copying and pasting the URL into your browser.
+* If none of these options works, navigate to `https://<app-name>.scm.azurewebsites.net/DebugConsole` and view the content under `/home/site/wwwroot`.
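The lookup order above can be sketched as a small decision helper. This is illustrative only: the setting names come from the list above, the URLs follow the Kudu endpoints mentioned, and the function itself is hypothetical, not part of any Azure SDK.

```python
def package_download_source(app_settings: dict, app_name: str) -> str:
    """Return the URL to fetch the deployment package from, following the
    manual lookup order described above (illustrative sketch)."""
    scm = f"https://{app_name}.scm.azurewebsites.net"
    run_from_package = app_settings.get("WEBSITE_RUN_FROM_PACKAGE")
    if run_from_package and run_from_package != "1":
        # The setting holds a direct URL to the package.
        return run_from_package
    if run_from_package == "1":
        # The package lives in the site's SitePackages folder.
        return f"{scm}/api/vfs/data/SitePackages"
    if "SCM_RUN_FROM_PACKAGE" in app_settings:
        return app_settings["SCM_RUN_FROM_PACKAGE"]
    # Fall back to browsing /home/site/wwwroot in the debug console.
    return f"{scm}/DebugConsole"
```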
The rest of this article helps you troubleshoot potential causes of this error by inspecting your function app's content, identifying the root cause, and resolving the specific issue.
This section helps you troubleshoot 'cygrpc' related errors in your Python funct
> `Cannot import name 'cygrpc' from 'grpc._cython'`
-This error issue occurs when a Python function app fails to start with a proper Python interpreter. The root cause for this error is one of the following issues:
+This error occurs when a Python function app fails to start with a proper Python interpreter. The root cause for this error is one of the following issues:
- [The Python interpreter mismatches OS architecture](#the-python-interpreter-mismatches-os-architecture)
- [The Python interpreter is not supported by Azure Functions Python Worker](#the-python-interpreter-is-not-supported-by-azure-functions-python-worker)
Please check if your Python interpreter matches our expected version by `py --ve
If your Python interpreter version does not meet our expectation, please download the Python 3.6, 3.7, or 3.8 interpreter from [Python Software Foundation](https://python.org/downloads/release). ++
+## Troubleshoot Python Exited With Code 137
+
+Code 137 errors are typically caused by out-of-memory issues in your Python function app. As a result, you get the following Azure Functions error message:
+
+> `Microsoft.Azure.WebJobs.Script.Workers.WorkerProcessExitException : python exited with code 137`
+
+This error occurs when a Python function app is forced to terminate by the operating system with a SIGKILL signal. This signal usually indicates an out-of-memory error in your Python process. The Azure Functions platform has a [service limitation](functions-scale.md#service-limits) that terminates any function app exceeding its memory limit.
+
+To analyze the memory bottleneck in your function app, see the tutorial section in [memory profiling on Python functions](python-memory-profiler-reference.md#memory-profiling-process).
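Before running the full memory profiler, a quick local check with Python's built-in `tracemalloc` module can reveal which source lines allocate the most memory. This is a generic sketch, not the Azure-specific tooling; the helper name and workload are illustrative.

```python
import tracemalloc

def find_top_allocations(workload, limit=3):
    """Run a callable under tracemalloc and return the top allocating source lines."""
    tracemalloc.start()
    result = workload()  # keep a reference so allocations are still live at snapshot time
    snapshot = tracemalloc.take_snapshot()
    tracemalloc.stop()
    del result
    return snapshot.statistics("lineno")[:limit]

# Example: a workload that allocates roughly 10 MB of small byte buffers.
stats = find_top_allocations(lambda: [bytes(1000) for _ in range(10_000)])
for stat in stats:
    print(stat)
```

Running this against your suspect function locally can quickly confirm whether a single allocation site dominates before you move on to full profiling.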
+++
+## Troubleshoot Python Exited With Code 139
+
+This section helps you troubleshoot segmentation fault errors in your Python function app. These errors typically result in the following Azure Functions error message:
+
+> `Microsoft.Azure.WebJobs.Script.Workers.WorkerProcessExitException : python exited with code 139`
+
+This error occurs when a Python function app is forced to terminate by the operating system with a SIGSEGV signal. This signal indicates a memory segmentation violation, which can be caused by unexpectedly reading from or writing to a restricted memory region. The following sections list common root causes.
+
+### A regression from third-party packages
+
+In your function app's requirements.txt, an unpinned package will be upgraded to the latest version in every Azure Functions deployment. Vendors of these packages may introduce regressions in their latest release. To recover from this issue, try commenting out the import statements, disabling the package references, or pinning the package to a previous version in requirements.txt.
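For example, pinning exact versions in requirements.txt prevents silent upgrades on the next deployment. The package names and version numbers below are placeholders; pin to the last versions known to work for your app.

```text
# requirements.txt -- pin exact versions instead of leaving them unpinned
azure-functions==1.6.0
requests==2.25.1        # pinned to a known-good release
# numpy                 # temporarily disabled while isolating a regression
```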
+
+### Unpickling from a malformed .pkl file
+
+If your function app uses the Python pickle library to load Python objects from a .pkl file, the file may contain a malformed byte string or an invalid address reference. To recover from this issue, try commenting out the pickle.load() call.
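As a diagnostic step, you can try loading the .pkl file locally in a guarded way to confirm whether it unpickles cleanly. This is a sketch with an illustrative helper name; note that a crash originating inside a C extension during unpickling cannot be caught this way.

```python
import pickle

def try_unpickle(path):
    """Attempt to load a .pkl file; return (object, None) on success or (None, error)."""
    try:
        with open(path, "rb") as f:
            return pickle.load(f), None
    except Exception as err:  # malformed byte strings raise UnpicklingError and friends
        return None, err
```

If the helper reports an error for your .pkl file, regenerate the file from its original source rather than attempting to repair the bytes.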
+++

## Next steps

If you're unable to resolve your issue, please report this to the Functions team:
azure-government Secure Azure Computing Architecture https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-government/compliance/secure-azure-computing-architecture.md
-# This basic template provides core metadata fields for Markdown articles on docs.microsoft.com.
-
-# Mandatory fields.
Title: Secure Azure Computing Architecture description: Learn about the Secure Azure Computing Architecture (SACA). Using SACA allows U.S. DoD and civilian customers to comply with the SCCA FRD. Last updated 4/9/2019
-# Use ms.service for services or ms.prod for on-premises products. Remove the # before the relevant field.
-# ms.prod: product-name-from-allow-list
-
-# Optional fields. Don't forget to remove # if you need a field.
-#
-#
-#
+ # Secure Azure Computing Architecture U.S. Department of Defense (DoD) customers who deploy workloads to Azure have asked for guidance to set up secure virtual networks and configure the security tools and services that are stipulated by DoD standards and practice.
azure-monitor Asp Net Troubleshoot No Data https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/asp-net-troubleshoot-no-data.md
For more information,
## Collect logs with dotnet-trace
-An alternate method of collecting logs for troubleshooting that may be particularly helpful for linux-based environments is [`dotnet-trace`](/dotnet/core/diagnostics/dotnet-trace)
+Alternatively, you can use [`dotnet-trace`](/dotnet/core/diagnostics/dotnet-trace), a cross-platform .NET Core tool, to collect logs that can further help in troubleshooting. This may be particularly helpful for Linux-based environments.
+
+After installing [`dotnet-trace`](/dotnet/core/diagnostics/dotnet-trace), run the following command in bash.
```bash dotnet-trace collect --process-id <PID> --providers Microsoft-ApplicationInsights-Core,Microsoft-ApplicationInsights-Data,Microsoft-ApplicationInsights-WindowsServer-TelemetryChannel,Microsoft-ApplicationInsights-Extensibility-AppMapCorrelation-Dependency,Microsoft-ApplicationInsights-Extensibility-AppMapCorrelation-Web,Microsoft-ApplicationInsights-Extensibility-DependencyCollector,Microsoft-ApplicationInsights-Extensibility-HostingStartup,Microsoft-ApplicationInsights-Extensibility-PerformanceCollector,Microsoft-ApplicationInsights-Extensibility-EventCounterCollector,Microsoft-ApplicationInsights-Extensibility-PerformanceCollector-QuickPulse,Microsoft-ApplicationInsights-Extensibility-Web,Microsoft-ApplicationInsights-Extensibility-WindowsServer,Microsoft-ApplicationInsights-WindowsServer-Core,Microsoft-ApplicationInsights-LoggerProvider,Microsoft-ApplicationInsights-Extensibility-EventSourceListener,Microsoft-ApplicationInsights-AspNetCore
Learn how to remove Application Insights in Visual Studio by following the steps
## Still not working... * [Microsoft Q&A question page for Application Insights](/answers/topics/azure-monitor.html)-
azure-monitor Get Metric https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/get-metric.md
Last updated 04/28/2020
The Azure Monitor Application Insights .NET and .NET Core SDKs have two different methods of collecting custom metrics, `TrackMetric()` and `GetMetric()`. The key difference between these two methods is local aggregation. `TrackMetric()` lacks pre-aggregation while `GetMetric()` has pre-aggregation. The recommended approach is to use aggregation; therefore, `TrackMetric()` is no longer the preferred method of collecting custom metrics. This article will walk you through using the `GetMetric()` method, and some of the rationale behind how it works.
-## TrackMetric versus GetMetric
+## Pre-aggregating versus non-pre-aggregating API
`TrackMetric()` sends raw telemetry denoting a metric. It is inefficient to send a single telemetry item for each value. `TrackMetric()` is also inefficient in terms of performance since every `TrackMetric(item)` goes through the full SDK pipeline of telemetry initializers and processors. Unlike `TrackMetric()`, `GetMetric()` handles local pre-aggregation for you and then submits only an aggregated summary metric at a fixed interval of one minute. So if you need to closely monitor some custom metric at the second or even millisecond level, you can do so while incurring only the storage and network traffic cost of one update per minute. This also greatly reduces the risk of throttling, since the total number of telemetry items that need to be sent for an aggregated metric is greatly reduced.
computersSold.TrackValue(100, "Dim1Value1", "Dim2Value3");
// The above call does not track the metric, and returns false. ```
-* `seriesCountLimit` is the max number of data time series a metric can contain. Once this limit is reached, calls to `TrackValue()` will not be tracked.
+* `seriesCountLimit` is the max number of data time series a metric can contain. Once this limit is reached, calls to `TrackValue()` that would normally result in a new series will return false.
* `valuesPerDimensionLimit` limits the number of distinct values per dimension in a similar manner. * `restrictToUInt32Values` determines whether or not only non-negative integer values should be tracked.
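The pre-aggregation behavior described above, including the series-count cap, can be sketched generically. This is illustrative Python, not the .NET SDK's actual API; the class and method names are hypothetical.

```python
class PreAggregatedMetric:
    """Illustrative sketch (not the Application Insights SDK) of a locally
    pre-aggregated metric with a series-count cap."""

    def __init__(self, series_count_limit=1000):
        self.series_count_limit = series_count_limit
        self.series = {}

    def track_value(self, value, *dimensions):
        """Accumulate a value locally; return False when a new series would
        exceed the cap (mirroring TrackValue() at seriesCountLimit)."""
        if dimensions not in self.series:
            if len(self.series) >= self.series_count_limit:
                return False
            self.series[dimensions] = {"count": 0, "sum": 0.0, "min": value, "max": value}
        s = self.series[dimensions]
        s["count"] += 1
        s["sum"] += value
        s["min"] = min(s["min"], value)
        s["max"] = max(s["max"], value)
        return True

    def flush(self):
        """Emit one aggregated summary per series and reset; the SDK does this
        at a fixed one-minute interval."""
        summaries, self.series = self.series, {}
        return summaries
```

The design point is that however many values arrive per second, only one summary per series leaves the process per flush interval, which is what keeps storage, network traffic, and throttling risk low.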
azure-monitor Java Standalone Sampling Overrides https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/java-standalone-sampling-overrides.md
If no sampling overrides match:
is used. * If this is not the first span in the trace, then the parent sampling decision is used.
-> [!IMPORTANT]
+> [!WARNING]
> When a decision has been made to not collect a span, then all downstream spans will also not be collected, > even if there are sampling overrides that match the downstream span. > This behavior is necessary because otherwise broken traces would result, with downstream spans being collected
azure-monitor Metrics Charts https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/essentials/metrics-charts.md
To create another chart that uses a different metric, select **Add chart**.
To reorder or delete multiple charts, select the ellipsis (**...**) button to open the chart menu. Then choose **Move up**, **Move down**, or **Delete**.
+## Time range controls
+
+In addition to changing the time range with the [time picker panel](metrics-getting-started.md#select-a-time-range), you can pan and zoom using the controls in the chart area.
+### Pan
+
+To pan, click the left and right arrows at the edge of the chart. This moves the selected time range back or forward by half the chart's time span. For example, if you're viewing the past 24 hours, clicking the left arrow shifts the time range to span from 36 hours ago to 12 hours ago.
+
+Most metrics support 93 days of retention but only let you view 30 days at a time. Using the pan controls, you can look at the past 30 days and then easily walk back 15 days at a time to view the rest of the retention period.
+
+![Animated gif showing the left and right pan controls.](./media/metrics-charts/metrics-pan-controls.gif)
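The pan arithmetic can be sketched as follows; the helper is hypothetical, not part of any Azure SDK.

```python
from datetime import datetime, timedelta

def pan(start, end, direction):
    """Shift a [start, end] time window by half its span.
    direction is -1 to pan left (older) or +1 to pan right (newer)."""
    half = (end - start) / 2
    return start + direction * half, end + direction * half

# Panning a 24-hour window left yields the span from 36 hours ago to 12 hours ago.
end = datetime(2021, 3, 31)
start = end - timedelta(hours=24)
print(pan(start, end, -1))
```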
+
+### Zoom
+
+You can click and drag on the chart to zoom into a section of the chart. Zooming updates the chart's time range to span your selection and selects a smaller time grain if the time grain is set to "Automatic". The new time range applies to all charts in Metrics.
+
+![Animated gif showing the metrics zoom feature.](./media/metrics-charts/metrics-zoom-control.gif)
+ ## Aggregation When you add a metric to a chart, the metrics explorer automatically applies a default aggregation. The default makes sense in basic scenarios. But you can use a different aggregation to gain more insights about the metric.
azure-monitor Metrics Getting Started https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/essentials/metrics-getting-started.md
To create a metric chart, from your resource, resource group, subscription, or A
## Select a time range > [!WARNING]
-> [Most metrics in Azure are stored for 93 days](../essentials/data-platform-metrics.md#retention-of-metrics). However, you can query no more than 30 days worth of data on any single chart. This limitation doesn't apply to [log-based metrics](../app/pre-aggregated-metrics-log-metrics.md#log-based-metrics).
+> [Most metrics in Azure are stored for 93 days](../essentials/data-platform-metrics.md#retention-of-metrics). However, you can query no more than 30 days worth of data on any single chart. You can [pan](metrics-charts.md#pan) the chart to view the full retention. The 30 day limitation doesn't apply to [log-based metrics](../app/pre-aggregated-metrics-log-metrics.md#log-based-metrics).
By default, the chart shows the most recent 24 hours of metrics data. Use the **time picker** panel to change the time range, zoom in, or zoom out on your chart.
azure-monitor Metrics Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/essentials/metrics-troubleshoot.md
Some resources don't constantly emit their metrics. For example, Azure will no
[Most metrics in Azure are stored for 93 days](../essentials/data-platform-metrics.md#retention-of-metrics). However, you can only query for no more than 30 days worth of data on any single chart. This limitation doesn't apply to [log-based metrics](../app/pre-aggregated-metrics-log-metrics.md#log-based-metrics).
-**Solution:** If you see a blank chart or your chart only displays part of metric data, verify that the difference between start- and end- dates in the time picker doesn't exceed the 30-day interval.
+**Solution:** If you see a blank chart or your chart only displays part of metric data, verify that the difference between start and end dates in the time picker doesn't exceed the 30-day interval. After you select a 30-day interval, you can [pan](metrics-charts.md#pan) the chart to view the full retention window.
### All metric values were outside of the locked y-axis range
azure-monitor Network Insights Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/insights/network-insights-overview.md
Title: Azure Monitor for Networks
-description: An overview of Azure Monitor for Networks, which provides a comprehensive view of health and metrics for all deployed network resources without any configuration.
+ Title: Azure Monitor Network Insights
+description: An overview of Azure Monitor Network Insights, which provides a comprehensive view of health and metrics for all deployed network resources without any configuration.
Last updated 11/25/2020
-# Azure Monitor for Networks
+# Azure Monitor Network Insights
-Azure Monitor for Networks provides a comprehensive view of [health](../../service-health/resource-health-checks-resource-types.md) and [metrics](../essentials/metrics-supported.md) for all deployed network resources, without requiring any configuration. It also provides access to network monitoring capabilities like [Connection Monitor](../../network-watcher/connection-monitor-overview.md), [flow logging for network security groups (NSGs)](../../network-watcher/network-watcher-nsg-flow-logging-overview.md), and [Traffic Analytics](../../network-watcher/traffic-analytics.md). And it provides other network [diagnostic](../../network-watcher/network-watcher-monitoring-overview.md#diagnostics) features.
+Azure Monitor Network Insights provides a comprehensive view of [health](../../service-health/resource-health-checks-resource-types.md) and [metrics](../essentials/metrics-supported.md) for all deployed network resources, without requiring any configuration. It also provides access to network monitoring capabilities like [Connection Monitor](../../network-watcher/connection-monitor-overview.md), [flow logging for network security groups (NSGs)](../../network-watcher/network-watcher-nsg-flow-logging-overview.md), and [Traffic Analytics](../../network-watcher/traffic-analytics.md). And it provides other network [diagnostic](../../network-watcher/network-watcher-monitoring-overview.md#diagnostics) features.
-Azure Monitor for Networks is structured around these key components of monitoring:
+Azure Monitor Network Insights is structured around these key components of monitoring:
- [Network health and metrics](#networkhealth) - [Connectivity](#connectivity) - [Traffic](#traffic)
Azure Monitor for Networks is structured around these key components of monitori
## <a name="networkhealth"></a>Network health and metrics
-The Azure Monitor for Networks **Overview** page provides an easy way to visualize the inventory of your networking resources, together with resource health and alerts. It's divided into four key functional areas: search and filtering, resource health and metrics, alerts, and dependency view.
+The Azure Monitor Network Insights **Overview** page provides an easy way to visualize the inventory of your networking resources, together with resource health and alerts. It's divided into four key functional areas: search and filtering, resource health and metrics, alerts, and dependency view.
[![Screenshot that shows the Overview page](media/network-insights-overview/overview.png)](media/network-insights-overview/overview.png#lightbox)
You can customize the resource health and alerts view by using filters like **Su
You can use the search box to search for resources and their associated resources. For example, a public IP is associated with an application gateway. A search for the public IP's DNS name will return both the public IP and the associated application gateway:
-[![Screenshot that shows Azure Monitor for Networks search results.](media/network-insights-overview/search.png)](media/network-insights-overview/search.png#lightbox)
+[![Screenshot that shows Azure Monitor Network Insights search results.](media/network-insights-overview/search.png)](media/network-insights-overview/search.png#lightbox)
### Resource health and metrics In the following example, each tile represents a resource type. The tile displays the number of instances of that resource type deployed across all selected subscriptions. It also displays the health status of the resource. In this example, there are 105 ER and VPN connections deployed. 103 are healthy, and 2 are unavailable.
-![Screenshot that shows resource health and metrics in Azure Monitor for Networks.](media/network-insights-overview/resource-health.png)
+![Screenshot that shows resource health and metrics in Azure Monitor Network Insights.](media/network-insights-overview/resource-health.png)
If you select the unavailable ER and VPN connections, you'll see a metric view:
-![Screenshot that shows the metric view in Azure Monitor for Networks.](media/network-insights-overview/metric-view.png)
+![Screenshot that shows the metric view in Azure Monitor Network Insights.](media/network-insights-overview/metric-view.png)
You can select any item in the grid view. Select the icon in the **Health** column to get resource health for that connection. Select the value in the **Alert** column to go to the alerts and metrics page for the connection.
The **Alert** box on the right side of the page provides a view of all alerts ge
### Dependency view Dependency view helps you visualize how a resource is configured. Dependency view is currently available for Azure Application Gateway, Azure Virtual WAN, and Azure Load Balancer. For example, for Application Gateway, you can access dependency view by selecting the Application Gateway resource name in the metrics grid view. You can do the same thing for Virtual WAN and Load Balancer.
-![Sreenshot that shows Application Gateway view in Azure Monitor for Networks.](media/network-insights-overview/application-gateway.png)
+![Screenshot that shows Application Gateway view in Azure Monitor Network Insights.](media/network-insights-overview/application-gateway.png)
The dependency view for Application Gateway provides a simplified view of how the front-end IPs are connected to the listeners, rules, and backend pool. The connecting lines are color coded and provide additional details based on the backend pool health. The view also provides a detailed view of Application Gateway metrics and metrics for all related backend pools, like virtual machine scale set and VM instances.
-[![Screenshot that shows dependency view in Azure Monitor for Networks.](media/network-insights-overview/dependency-view.png)](media/network-insights-overview/dependency-view.png#lightbox)
+[![Screenshot that shows dependency view in Azure Monitor Network Insights.](media/network-insights-overview/dependency-view.png)](media/network-insights-overview/dependency-view.png#lightbox)
The dependency graph provides easy navigation to configuration settings. Right-click a backend pool to access other information. For example, if the backend pool is a VM, you can directly access VM Insights and Azure Network Watcher connection troubleshooting to identify connectivity issues:
-![Screenshot that shows the dependency view menu in Azure Monitor for Networks.](media/network-insights-overview/dependency-view-menu.png)
+![Screenshot that shows the dependency view menu in Azure Monitor Network Insights.](media/network-insights-overview/dependency-view-menu.png)
The search and filter bar on the dependency view provides an easy way to search through the graph. For example, if you search for **AppGWTestRule** in the previous example, the view will scale down to all nodes connected via AppGWTestRule:
-![Screenshot that shows an example of a search in Azure Monitor for Networks.](media/network-insights-overview/search-example.png)
+![Screenshot that shows an example of a search in Azure Monitor Network Insights.](media/network-insights-overview/search-example.png)
Various filters help you scale down to a specific path and state. For example, select only **Unhealthy** from the **Health status** list to show all edges for which the state is unhealthy.
Select **View detailed metrics** to open a preconfigured workbook that provides
The **Connectivity** tab provides an easy way to visualize all tests configured via [Connection Monitor](../../network-watcher/connection-monitor-overview.md) and Connection Monitor (classic) for the selected set of subscriptions.
-![Screenshot that shows the Connectivity tab in Azure Monitor for Networks.](media/network-insights-overview/azure-monitor-for-networks-connectivity-tab.png)
+![Screenshot that shows the Connectivity tab in Azure Monitor Network Insights.](media/network-insights-overview/azure-monitor-for-networks-connectivity-tab.png)
Tests are grouped by **Sources** and **Destinations** tiles and display the reachability status for each test. Reachable settings provide easy access to configurations for your reachability criteria, based on checks failed (%) and RTT (ms). After you set the values, the status for each test updates based on the selection criteria.
-[![Screenshot that shows connectivity tests in Azure Monitor for Networks.](media/network-insights-overview/azure-monitor-for-networks-connectivity-tests.png)](media/network-insights-overview/azure-monitor-for-networks-connectivity-tests.png#lightbox)
+[![Screenshot that shows connectivity tests in Azure Monitor Network Insights.](media/network-insights-overview/azure-monitor-for-networks-connectivity-tests.png)](media/network-insights-overview/azure-monitor-for-networks-connectivity-tests.png#lightbox)
You can select any source or destination tile to open a metric view:
-[![Screenshot that shows connectivity metrics in Azure Monitor for Networks.](media/network-insights-overview/azure-monitor-for-networks-connectivity-metrics.png)](media/network-insights-overview/azure-monitor-for-networks-connectivity-metrics.png#lightbox)
+[![Screenshot that shows connectivity metrics in Azure Monitor Network Insights.](media/network-insights-overview/azure-monitor-for-networks-connectivity-metrics.png)](media/network-insights-overview/azure-monitor-for-networks-connectivity-metrics.png#lightbox)
You can select any item in the grid view. Select the icon in the **Reachability** column to go to the Connection Monitor portal page and view the hop-by-hop topology and connectivity affecting issues identified. Select the value in the **Alert** column to go to alerts. Select the graphs in the **Checks Failed Percent** and **Round-Trip Time (ms)** columns to go to the metrics page for the selected connection monitor.
The **Alert** box on the right side of the page provides a view of all alerts
## <a name="traffic"></a>Traffic The **Traffic** tab provides access to all NSGs configured for [NSG flow logs](../../network-watcher/network-watcher-nsg-flow-logging-overview.md) and [Traffic Analytics](../../network-watcher/traffic-analytics.md) for the selected set of subscriptions, grouped by location. The search functionality provided on this tab enables you to identify the NSGs configured for the searched IP address. You can search for any IP address in your environment. The tiled regional view will display all NSGs along with the NSG flow logs and Traffic Analytics configuration status.
-[![Screenshot that shows the Traffic tab in Azure Monitor for Networks.](media/network-insights-overview/azure-monitor-for-networks-traffic-view.png)](media/network-insights-overview/azure-monitor-for-networks-traffic-view.png#lightbox)
+[![Screenshot that shows the Traffic tab in Azure Monitor Network Insights.](media/network-insights-overview/azure-monitor-for-networks-traffic-view.png)](media/network-insights-overview/azure-monitor-for-networks-traffic-view.png#lightbox)
If you select any region tile, a grid view appears. The grid provides NSG flow logs and Traffic Analytics in a view that's easy to read and configure:
-[![Screenshot that shows the traffic region view in Azure Monitor for Networks.](media/network-insights-overview/azure-monitor-for-networks-traffic-region-view.png)](media/network-insights-overview/azure-monitor-for-networks-traffic-region-view.png#lightbox)
+[![Screenshot that shows the traffic region view in Azure Monitor Network Insights.](media/network-insights-overview/azure-monitor-for-networks-traffic-region-view.png)](media/network-insights-overview/azure-monitor-for-networks-traffic-region-view.png#lightbox)
You can select any item in the grid view. Select the icon in the **Flowlog Configuration Status** column to edit the NSG flow log and Traffic Analytics configuration. Select the value in the **Alert** column to go to the traffic alerts configured for the selected NSG. Similarly, you can go to the Traffic Analytics view by selecting the **Traffic Analytics Workspace**.
Onboarded resources have built-in workbooks, and dependency views. Currently onb
## Troubleshooting For general troubleshooting guidance, see the dedicated workbook-based insights [troubleshooting article](troubleshoot-workbooks.md).
-This section will help you diagnose and troubleshoot some common problems you might encounter when you use Azure Monitor for Networks.
+This section will help you diagnose and troubleshoot some common problems you might encounter when you use Azure Monitor Network Insights.
### How do I resolve performance problems or failures?
-To learn about troubleshooting any networking-related problems you identify with Azure Monitor for Networks, see the troubleshooting documentation for the malfunctioning resource.
+To learn about troubleshooting any networking-related problems you identify with Azure Monitor Network Insights, see the troubleshooting documentation for the malfunctioning resource.
Here are some links to troubleshooting articles for frequently used services. For more troubleshooting articles about these services, see the other articles in the Troubleshooting section of the table of contents for the service. * [Azure Virtual Network](../../virtual-network/virtual-network-troubleshoot-peering-issues.md)
Here are some links to troubleshooting articles for frequently used services. Fo
### Why don't I see the resources for all the subscriptions I've selected?
-Azure Monitor for Networks can show resources for only five subscriptions at a time.
+Azure Monitor Network Insights can show resources for only five subscriptions at a time.
-### How do I make changes or add visualizations to Azure Monitor for Networks?
+### How do I make changes or add visualizations to Azure Monitor Network Insights?
To make changes, select **Edit Mode** to modify the workbook. You can then save your changes as a new workbook that's tied to a designated subscription and resource group. ### What's the time grain after I pin any part of the workbooks?
-Azure Monitor for Networks uses the **Auto** time grain, so the time grain is based on the selected time range.
+Azure Monitor Network Insights uses the **Auto** time grain, so the time grain is based on the selected time range.
### What's the time range when any part of a workbook is pinned? The time range depends on the dashboard settings.
-### What if I want to see other data or make my own visualizations? How can I make changes to Azure Monitor for Networks?
+### What if I want to see other data or make my own visualizations? How can I make changes to Azure Monitor Network Insights?
You can edit the workbook you see in any side-panel or detailed metric view by using the edit mode. You can then save your changes as a new workbook.
azure-netapp-files Azure Netapp Files Resource Limits https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-netapp-files/azure-netapp-files-resource-limits.md
na ms.devlang: na Previously updated : 01/29/2021 Last updated : 03/30/2021 # Resource limits for Azure NetApp Files
The following table describes resource limits for Azure NetApp Files:
| Maximum size of a single file | 16 TiB | No | | Maximum size of directory metadata in a single directory | 320 MB | No | | Maximum number of files ([maxfiles](#maxfiles)) per volume | 100 million | Yes |
+| Maximum number of export policy rules per volume | 5 | No |
| Minimum assigned throughput for a manual QoS volume | 1 MiB/s | No | | Maximum assigned throughput for a manual QoS volume | 4,500 MiB/s | No | | Number of cross-region replication data protection volumes (destination volumes) | 5 | Yes |
azure-netapp-files Configure Nfs Clients https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-netapp-files/configure-nfs-clients.md
The NFS client configuration described in this article is part of the setup when you [configure NFSv4.1 Kerberos encryption](configure-kerberos-encryption.md) or [create a dual-protocol volume](create-volumes-dual-protocol.md). A wide variety of Linux distributions are available to use with Azure NetApp Files. This article describes configurations for two of the more commonly used environments: RHEL 8 and Ubuntu 18.04.
+## Requirements and considerations
+ Regardless of the Linux flavor you use, the following configurations are required:

* Configure an NTP client to avoid issues with time skew.
* Configure DNS entries of the Linux client for name resolution. This configuration must include the "A" (forward) record and the PTR (reverse) record.
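As a quick sanity check that the forward and reverse records agree, you can run a short script with Python's standard `socket` module on the client. This is a generic sketch using `localhost` as the example host; substitute your own hostname.

```python
import socket

def check_dns(hostname):
    """Resolve forward (A record) then reverse (PTR record) and return both."""
    ip = socket.gethostbyname(hostname)                        # forward lookup
    reverse_name, _aliases, _addrs = socket.gethostbyaddr(ip)  # reverse lookup
    return ip, reverse_name

print(check_dns("localhost"))
```

If the reverse lookup raises an error or returns a different name than you expect, the PTR record is missing or inconsistent with the A record.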
azure-percept How To Set Up Over The Air Updates https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-percept/how-to-set-up-over-the-air-updates.md
Title: Set up Azure IoT Hub to deploy over the air updates
-description: Learn how to configure Azure IoT Hub to deploy updates over the air to Azure Percept DK
+ Title: Set up Azure IoT Hub to deploy over-the-air updates
+description: Learn how to configure Azure IoT Hub to deploy updates over-the-air to Azure Percept DK
Previously updated : 02/18/2021 Last updated : 03/30/2021

# How to set up Azure IoT Hub to deploy over the air updates to your Azure Percept DK

Keep your Azure Percept DK secure and up to date using over-the-air updates. In a few simple steps, you will be able to set up your Azure environment with Device Update for IoT Hub and deploy the latest updates to your Azure Percept DK.
+## Prerequisites
+
+- Azure Percept DK (devkit)
+- [Azure subscription](https://azure.microsoft.com/free/)
+- [Azure Percept DK setup experience](./quickstart-percept-dk-set-up.md): you connected your dev kit to a Wi-Fi network, created an IoT Hub, and connected your dev kit to the IoT Hub
+ ## Create a Device Update Account
-1. Go to the [Azure portal](https://portal.azure.com) and sign in with the Azure account you are using with Azure Percept
+1. Go to the [Azure portal](https://portal.azure.com) and sign in with the Azure account you are using with Azure Percept.
+
+1. In the search bar at the top of the page, enter **Device Update for IoT Hub**.
-1. In the search window at the top of the page, begin typing "Device Update for IoT Hub"
+1. Select **Device Update for IoT Hub** when it appears in the search bar.
-1. Select **Device Update for IoT Hubs** as it appears in the search window.
+1. Click the **+Add** button in the upper-left portion of the page.
-1. Click the **+Add** button at the upper-left portion of the page.
+1. Select the **Azure Subscription** and **Resource Group** associated with your Azure Percept device and its IoT Hub.
-1. Select the Azure Subscription and Resource Group associated with your Azure Percept device (this is where the IoT Hub from onboarding is located).
+1. Specify a **Name** and **Location** for your Device Update Account.
-1. Specify a Name and Location for your Device Update Account
+1. Review the details and select **Review + Create**.
-1. Review the details and then select **Review + Create**.
-
1. Once deployment is complete, click **Go to resource**.
-
+ ## Create a Device Update Instance
-Now, create an instance within your Device Update for IoT Hub account.
1. In your Device Update for IoT Hub resource, click **Instances** under **Instance Management**.
-
-1. Click **+ Create**, specify an instance name and select the IoT Hub associated with your Azure Percept device (i.e., created during Onboarding Experience). This may take a few minutes to complete.
-
-1. Click **Create**
+
+1. Click **+ Create**, specify an instance name, and select the IoT Hub associated with your Azure Percept device. This may take a few minutes to complete.
+
+1. Click **Create**.
## Configure IoT Hub
-1. In the Instance Management **Instances** page, wait for your Device Update Instance to move to a **Succeeded** state. Click the **Refresh** icon next to **Delete** to update the state.
-
-1. Select the Instance that has been created for you and then click **Configure IoT Hub**. In the left pane, select **I agree to make these changes** and click **Update**.
-
+1. In the Instance Management **Instances** page, wait for your Device Update Instance to move to a **Succeeded** state. Click the **Refresh** icon to update the state.
+
+1. Select the Instance that has been created for you and click **Configure IoT Hub**. In the left pane, select **I agree to make these changes** and click **Update**.
+ 1. Wait for the process to complete successfully.
-
+
## Configure access control roles
+
The final step will enable you to grant permissions to users to publish and deploy updates.
-1. In your Device Update for IoT Hub resource, click **Access control (IAM)**
-
-2. Click **+Add** and then select **Add role assignment**
-
-3. For **Role**, select **Device Update Administrator**. For **Assign access to** select **User, group, or service principle**. For **Select** select your account or the account of the person who will be deploying updates. Then, click **Save**.
+1. In your Device Update for IoT Hub resource, click **Access control (IAM)**.
+
+1. Click **+Add** and then select **Add role assignment**.
+
+1. For **Role**, select **Device Update Administrator**. For **Assign access to**, select **User, group, or service principal**. For **Select**, select your account or the account of the person who will be deploying updates. Click **Save**.
- > [!TIP]
- > If you would like to give more people in your organization access, you can repeat this step and make each of these users a **Device Update Administrator**.
+> [!TIP]
+> If you would like to give more people in your organization access, you can repeat this step and make each of these users a **Device Update Administrator**.
## Next steps
-You are now set and can [update your Azure Percept dev kit over-the-air](./how-to-update-over-the-air.md) using Device Update for IoT Hub. Navigate to the Azure IoT Hub that you are using for your Azure Percept device.
+You are now ready to [update your Azure Percept dev kit over-the-air](./how-to-update-over-the-air.md) using Device Update for IoT Hub.
azure-percept How To Update Over The Air https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-percept/how-to-update-over-the-air.md
Title: Update your Azure Percept DK over the air
-description: Learn how to receive over the air updates to your Azure Percept DK
+ Title: Update your Azure Percept DK over-the-air (OTA)
+description: Learn how to receive over-the-air (OTA) updates to your Azure Percept DK
Previously updated : 02/18/2021 Last updated : 03/30/2021
-# Update your Azure Percept DK over the air
+# Update your Azure Percept DK over-the-air (OTA)
-Follow this guide to learn how to update the carrier board of your Azure Percept DK over-the-air with Device Update for IoT Hub.
+Follow this guide to learn how to update the OS and firmware of the carrier board of your Azure Percept DK over-the-air (OTA) with Device Update for IoT Hub.
-## Import your update file and manifest file.
+## Prerequisites
+
+- Azure Percept DK (devkit)
+- [Azure subscription](https://azure.microsoft.com/free/)
+- [Azure Percept DK setup experience](./quickstart-percept-dk-set-up.md): you connected your dev kit to a Wi-Fi network, created an IoT Hub, and connected your dev kit to the IoT Hub
+- [Device Update for IoT Hub has been successfully configured](./how-to-set-up-over-the-air-updates.md)
+
+## Import your update file and manifest file
> [!NOTE]
-> If you have already imported the update, you can skip directly to **Create a Device Updated Group**.
+> If you have already imported the update, you can skip directly to **Create a device update group**.
+
+1. [Download the appropriate manifest file (.json) and update file (.swu) for your Azure Percept device](https://go.microsoft.com/fwlink/?linkid=2155625).
1. Navigate to the Azure IoT Hub that you are using for your Azure Percept device. On the left-hand menu panel, select **Device Updates** under **Automatic Device Management**.
-
+ 1. You will see several tabs across the top of the screen. Select the **Updates** tab.
-
+ 1. Select **+ Import New Update** below the **Ready to Deploy** header.
-
-1. Click on the boxes under **Select Import Manifest File** and **Select Update Files** to select the appropriate manifest file (.json) and one update file (.swu). You can find these update files for your Azure Percept device [here](https://go.microsoft.com/fwlink/?linkid=2155625).
-
-1. Select the folder icon or text box under **Select a storage container**, then select the appropriate storage account.
-
-1. If you've already created a storage container, you can re-use it. Otherwise, select **+ Container** to create a new storage container for OTA updates. Select the container you wish to use and click **Select**.
-
- >[!Note]
- >If you do not have a container you will be asked to create one.
-
-1. Select **Submit** to start the import process. The submission process will take around 4 minutes.
-
- >[!Note]
- >You might be asked to add a Cross Origin Request (CORS) rule to access the selected storage container. Select **Add rule and retry** to proceed.
-
- >[!Note]
- >Due to the image size, you may see the page **Submitting…** For up to 5 min before seeing next step.
-
-1. The import process begins, and you are redirected to the **Import History** tab of the **Device Updates** page. Click **Refresh** to monitor progress while the import process is completed. Depending on the size of the update, this may take a few minutes or longer (during peak times, please expect the import service to take up to 1hr).
-
-1. When the Status column indicates the import has succeeded, select the **Ready to Deploy** tab and click **Refresh**. You should now see your imported update in the list.
-
-## Create a Device Update Group
+
+1. Click on the boxes under **Select Import Manifest File** and **Select Update Files** to select your manifest file (.json) and update file (.swu).
+
+1. Select the folder icon or text box under **Select a storage container** and select the appropriate storage account. If you've already created a storage container, you may re-use it. Otherwise, select **+ Container** to create a new storage container for OTA updates. Select the container you wish to use and click **Select**.
+
+1. Select **Submit** to start the import process. Due to the image size, the submission process may take up to 5 minutes.
+
+ > [!NOTE]
+ > You may be asked to add a Cross Origin Request (CORS) rule to access the selected storage container. Select **Add rule and retry** to proceed.
+
+1. When the import process begins, you will be redirected to the **Import History** tab of the **Device Updates** page. Click **Refresh** to monitor progress while the import process is completed. Depending on the size of the update, this may take a few minutes or longer (during peak times, the import service may take up to 1 hour).
+
+1. When the **Status** column indicates that the import has succeeded, select the **Ready to Deploy** tab and click **Refresh**. You should now see your imported update in the list.
+
+## Create a device update group
+
Device Update for IoT Hub allows you to target an update to specific groups of Azure Percept DKs. To create a group, you must add a tag to your target set of devices in Azure IoT Hub.

> [!NOTE]
-> If you have already created a group, you can skip directly to the next step.
+> If you have already created a group, you can skip to the next section.
Group Tag Requirements:
-- You can add any value to your tag except for **Uncategorized**, which is a reserved value.
+
+- You can add any value to your tag except for "Uncategorized", which is a reserved value.
- Tag value cannot exceed 255 characters.
- Tag value can only contain these special characters: ".", "-", "_", "~".
- Tag and group names are case sensitive.
- A device can only have one tag. Any subsequent tag added to the device will override the previous tag.
- A device can only belong to one group.
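As an illustration only, the tag rules above can be captured in a small validator sketch. The function name `is_valid_adu_tag` and the assumption that the remaining allowed characters are alphanumeric are mine, not part of Device Update:

```python
import re

RESERVED = {"Uncategorized"}  # reserved value; cannot be used as a tag

def is_valid_adu_tag(value: str) -> bool:
    """Check a group tag value against the documented rules: non-empty,
    at most 255 characters, only alphanumerics plus '.', '-', '_', '~',
    and not a reserved value. Comparison is case sensitive."""
    if not value or value in RESERVED or len(value) > 255:
        return False
    return re.fullmatch(r"[A-Za-z0-9.\-_~]+", value) is not None

print(is_valid_adu_tag("AzurePerceptGroup1"))  # True
print(is_valid_adu_tag("Uncategorized"))       # False
```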
-1. Add a Tag to your device(s).
+1. Add a Tag to your device(s):
+ 1. From **IoT Edge** on the left navigation pane, find your Azure Percept DK and navigate to its **Device Twin**.
- 1. Add a new **Device Update for IoT Hub** tag value as shown below (Change ```<CustomTagValue>``` to your value, i.e. AzurePerceptGroup1). Learn more about device twin [JSON document tags](../iot-hub/iot-hub-devguide-device-twins.md#device-twins).
- ```
- "tags": {
- "ADUGroup": "<CustomTagValue>"
- },
- ```
+ 1. Add a new **Device Update for IoT Hub** tag value as shown below (```<CustomTagValue>``` refers to your tag value/name, e.g. AzurePerceptGroup1). Learn more about device twin [JSON document tags](../iot-hub/iot-hub-devguide-device-twins.md#device-twins).
+
+ ```
+ "tags": {
+ "ADUGroup": "<CustomTagValue>"
+ },
+ ```
-
1. Click **Save** and resolve any formatting issues.
-
-1. Create a group by selecting an existing Azure IoT Hub tag.
+
+1. Create a group by selecting an existing Azure IoT Hub tag:
+
    1. Navigate back to your Azure IoT Hub page.
+
    1. Select **Device Updates** under **Automatic Device Management** on the left-hand menu panel.
+
    1. Select the **Groups** tab. This page will display the number of ungrouped devices connected to Device Update.
+
    1. Select **+ Add** to create a new group.
+
    1. Select an IoT Hub tag from the list and click **Submit**.
+
    1. Once the group is created, the update compliance chart and groups list will update. The chart shows the number of devices in various states of compliance: **On latest update**, **New updates available**, **Updates in progress**, and **Not yet grouped**.
-
## Deploy an update
+
1. You should see your newly created group with a new update listed under **Available updates** (you may need to refresh once). Select the update.
-
-1. Confirm that the correct device group is selected as the target device group. Select a **Start date** and **Start time** for your deployment, then click **Create deployment**.
- >[!CAUTION]
- >Setting the start time in the past will trigger the deployment immediately.
-
+1. Confirm that the correct device group is selected as the target device group. Select a **Start date** and **Start time** for your deployment, then click **Create deployment**.
+
+ > [!CAUTION]
+ > Setting the start time in the past will trigger the deployment immediately.
+ 1. Check the compliance chart. You should see the update is now in progress.
-
+ 1. After your update has completed, your compliance chart will reflect your new update status.
-
+ 1. Select the **Deployments** tab at the top of the **Device updates** page.
-
1. Select your deployment to view the deployment details. You may need to click **Refresh** until the **Status** changes to **Succeeded**.

## Next steps
-Your dev kit is now successfully updated. You may continue development and operation with your devkit.
+Your dev kit is now successfully updated. You may continue development and operation with your dev kit.
azure-percept Vision Solution Troubleshooting https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-percept/vision-solution-troubleshooting.md
Previously updated : 02/18/2021 Last updated : 03/29/2021
See the following guidance for information on troubleshooting no-code vision sol
1. Your device modules will be listed under the **Modules** tab.
- :::image type="content" source="./media/vision-solution-troubleshooting/vision-device-modules-inline.png" alt-text="Iot Edge page for selected device showing the modules tab contents." lightbox= "./media/vision-solution-troubleshooting/vision-device-modules.png":::
+ :::image type="content" source="./media/vision-solution-troubleshooting/vision-device-modules-inline.png" alt-text="IoT Edge page for selected device showing the modules tab contents." lightbox= "./media/vision-solution-troubleshooting/vision-device-modules.png":::
## Delete a device
See the following guidance for information on troubleshooting no-code vision sol
1. Select **IoT Edge** and check the box next to your target device ID. Click the trash can icon to delete your device.
- :::image type="content" source="./media/vision-solution-troubleshooting/vision-delete-device.png" alt-text="Delete icon highlighted in Iot Edge homepage.":::
+ :::image type="content" source="./media/vision-solution-troubleshooting/vision-delete-device.png" alt-text="Delete icon highlighted in IoT Edge homepage.":::
## Eye module troubleshooting tips
-If there is a problem with **WebStreamModule**, ensure that **azureeyemodule**, which does the vision model inferencing, is running. To check the runtime status, go to the [Azure portal](https://portal.azure.com/?feature.canmodifystamps=true&Microsoft_Azure_Iothub=aduprod&microsoft_azure_marketplace_ItemHideKey=Microsoft_Azure_ADUHidden#home) and navigate to **All resources** -> **\<your IoT hub>** -> **IoT Edge** -> **\<your device ID>**. Click the **Modules** tab to see the runtime status of all installed modules.
+### Check the runtime status of azureeyemodule
+
+If there is a problem with **WebStreamModule**, ensure that **azureeyemodule**, which handles the vision model inferencing, is running. To check the runtime status, go to the [Azure portal](https://portal.azure.com/?feature.canmodifystamps=true&Microsoft_Azure_Iothub=aduprod&microsoft_azure_marketplace_ItemHideKey=Microsoft_Azure_ADUHidden#home) and navigate to **All resources** -> **\<your IoT hub>** -> **IoT Edge** -> **\<your device ID>**. Click the **Modules** tab to see the runtime status of all installed modules.
:::image type="content" source="./media/vision-solution-troubleshooting/over-the-air-iot-edge-device-page-inline.png" alt-text="Device module runtime status screen." lightbox= "./media/vision-solution-troubleshooting/over-the-air-iot-edge-device-page.png":::
If the runtime status of **azureeyemodule** is not listed as **running**, click
:::image type="content" source="./media/vision-solution-troubleshooting/firmware-desired-status-stopped.png" alt-text="Module setting configuration screen.":::
+### Update TelemetryInterval
+
+If you encounter the following count limitation error, the TelemetryInterval value in the azureeyemodule module twin settings will need to be updated.
+
+|Error Message|
+||
+|Total number of messages on IotHub 'xxxxxxxxx' exceeded the allocated quota. Max allowed message count: '8000', current message count: 'xxxx'. Send and Receive operations are blocked for this hub until the next UTC day. Consider increasing the units for this hub to increase the quota.|
+
+TelemetryInterval determines how often to send messages (in milliseconds) from the neural network. Azure subscriptions have a limited number of messages per day, depending on your subscription tier. If you find yourself locked out due to having sent too many messages, increase the TelemetryInterval to a higher value. A value of 12000 (one message every 12 seconds) yields 7,200 messages per day, which is under the 8,000-message limit for the free subscription.
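The arithmetic behind that recommendation can be sketched as follows (the 8,000-message cap is the free-tier quota quoted in the error message above):

```python
MS_PER_DAY = 24 * 60 * 60 * 1000  # milliseconds in one day

def messages_per_day(telemetry_interval_ms: int) -> int:
    """Messages emitted per day if one message is sent every telemetry_interval_ms."""
    return MS_PER_DAY // telemetry_interval_ms

# A 12,000 ms interval (one message every 12 seconds) stays under the
# 8,000-message daily quota of the free IoT Hub tier.
print(messages_per_day(12000))  # 7200
```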
+
+To update your TelemetryInterval value, follow these steps:
+
+1. Log in to the [Azure portal](https://ms.portal.azure.com/?feature.canmodifystamps=true&Microsoft_Azure_Iothub=aduprod#home) and open **All resources**.
+
+1. On the **All resources** page, click on the name of the IoT Hub that was provisioned to your devkit during the setup experience.
+
+1. On the left side of the IoT Hub page, click on **IoT Edge** under **Automatic Device Management**. On the IoT Edge devices page, find the device ID of your devkit. Click the device ID of your devkit to open its IoT Edge device page.
+
+1. Select **azureeyemodule** under the **Modules** tab.
+
+1. On the azureeyemodule page, open **Module Identity Twin**.
+
+ :::image type="content" source="./media/vision-solution-troubleshooting/module-page-inline.png" alt-text="Screenshot of module page." lightbox= "./media/vision-solution-troubleshooting/module-page.png":::
+
+1. Scroll down to **properties**. Note that the properties "Running" and "Logging" are not active at this time.
+
+ :::image type="content" source="./media/vision-solution-troubleshooting/module-identity-twin-inline.png" alt-text="Screenshot of module twin properties." lightbox= "./media/vision-solution-troubleshooting/module-identity-twin.png":::
+
+1. Update the **TelemetryInterval** value as desired and click the **Save** icon.
+
## View device RTSP video stream

View your device's RTSP video stream in [Azure Percept Studio](./how-to-view-video-stream.md) or [VLC media player](https://www.videolan.org/vlc/index.html).
azure-resource-manager Region Move Support https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/management/region-move-support.md
Jump to a resource provider namespace:
> | Resource type | Region move |
> | - | -- |
> | capabilities | No |
-> | domainnames | Yes | No |
+> | domainnames | No |
> | quotas | No |
> | resourcetypes | No |
> | validatesubscriptionmoveavailability | No |
Jump to a resource provider namespace:
> [!div class="mx-tableFixed"]
> | Resource type | Region move |
> | - | -- |
-> | accounts | No. [Learn more](../../azure-monitor/faq.md#how-do-i-move-an-application-insights-resource-to-a-new-region).
+> | accounts | No. [Learn more](../../azure-monitor/faq.md#how-do-i-move-an-application-insights-resource-to-a-new-region). |
> | actiongroups | No |
> | activitylogalerts | No |
> | alertrules | No |
Jump to a resource provider namespace:
> | diagnosticsettingscategories | No |
> | eventcategories | No |
> | eventtypes | No |
-> | extendeddiagnosticsettings | No | |
+> | extendeddiagnosticsettings | No |
> | guestdiagnosticsettings | No |
> | listmigrationdate | No |
> | logdefinitions | No |
> | logprofiles | No |
-> | logs | No | No |
+> | logs | No |
> | metricalerts | No |
> | metricbaselines | No |
> | metricbatch | No |
Jump to a resource provider namespace:
> | networkwatchers / pingmeshes | No |
> | p2svpngateways | No |
> | privatednszones | No |
-> | privatednszones / virtualnetworklinks | No |> | privatednszonesinternal | No |
+> | privatednszones / virtualnetworklinks | No |
+> | privatednszonesinternal | No |
> | privateendpointredirectmaps | No |
> | privateendpoints | No |
> | privatelinkservices | No |
azure-resource-manager Bicep Modules https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/templates/bicep-modules.md
Title: Bicep modules
description: Describes how to define and consume a module, and how to use module scopes.
Previously updated : 03/25/2021 Last updated : 03/30/2021
# Use Bicep modules (Preview)
-Bicep enables you to break down a complex solution into modules. A Bicep module is a set of one or more resources to be deployed together. Modules abstract away complex details of the raw resource declaration, which can increase readability. You can reuse these modules, and share them with other people. Combined with [template specs](./template-specs.md), it creates a way for modularity and code reuse. For a tutorial, see [Tutorial: Add Bicep modules](./bicep-tutorial-add-modules.md).
+Bicep enables you to break down a complex solution into modules. A Bicep module is a set of one or more resources to be deployed together. Modules abstract away complex details of the raw resource declaration, which can increase readability. You can reuse these modules, and share them with other people. Combined with [template specs](./template-specs.md), it creates a way for modularity and code reuse. Bicep modules are transpiled into a single ARM template with [nested templates](./linked-templates.md#nested-template) for deployment. In Bicep, [_dependsOn_](./template-syntax.md#resources) gets handled automatically.
+
+For a tutorial, see [Tutorial: Add Bicep modules](./bicep-tutorial-add-modules.md).
## Define modules
azure-resource-manager Deployment Script Template https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/templates/deployment-script-template.md
Previously updated : 03/23/2021 Last updated : 03/30/2021
Property value details:
- `identity`: For deployment script API version 2020-10-01 or later, a user-assigned managed identity is optional unless you need to perform any Azure-specific actions in the script. For the API version 2019-10-01-preview, a managed identity is required as the deployment script service uses it to execute the scripts. Currently, only user-assigned managed identity is supported.
- `kind`: Specify the type of script. Currently, Azure PowerShell and Azure CLI scripts are supported. The values are **AzurePowerShell** and **AzureCLI**.
- `forceUpdateTag`: Changing this value between template deployments forces the deployment script to re-execute. If you use the `newGuid()` or the `utcNow()` functions, both functions can only be used in the default value for a parameter. To learn more, see [Run script more than once](#run-script-more-than-once).
-- `containerSettings`: Specify the settings to customize Azure Container Instance. `containerGroupName` is for specifying the container group name. If not specified, the group name is automatically generated.
-- `storageAccountSettings`: Specify the settings to use an existing storage account. If not specified, a storage account is automatically created. See [Use an existing storage account](#use-existing-storage-account).
+- `containerSettings`: Specify the settings to customize Azure Container Instance. Deployment script requires a new Azure Container Instance. You can't specify an existing Azure Container Instance. However, you can customize the container group name by using `containerGroupName`. If not specified, the group name is automatically generated.
+- `storageAccountSettings`: Specify the settings to use an existing storage account. If not specified, a storage account is automatically created. See [Use an existing storage account](#use-existing-storage-account).
- `azPowerShellVersion`/`azCliVersion`: Specify the module version to be used. See a list of [supported Azure PowerShell versions](https://mcr.microsoft.com/v2/azuredeploymentscripts-powershell/tags/list). See a list of [supported Azure CLI versions](https://mcr.microsoft.com/v2/azure-cli/tags/list).

>[!IMPORTANT]
azure-resource-manager Template Deploy What If https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/templates/template-deploy-what-if.md
For more information about installing modules, see [Install Azure PowerShell](/p
## Install Azure CLI module
-To use what-if in Azure CLI, you must have Azure CLI 2.5.0 or later. If needed, [install the latest version of Azure CLI](/cli/azure/install-azure-cli).
+To use what-if in Azure CLI, you must have Azure CLI 2.14.0 or later. If needed, [install the latest version of Azure CLI](/cli/azure/install-azure-cli).
## See results
azure-resource-manager Test Cases https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/templates/test-cases.md
# Default test cases for ARM template test toolkit
-This article describes the default tests that are run with the [template test toolkit](test-toolkit.md). It provides examples that pass or fail the test. It includes the name of each test.
+This article describes the default tests that are run with the [template test toolkit](test-toolkit.md) for Azure Resource Manager templates (ARM templates). It provides examples that pass or fail the test. It includes the name of each test. To run a specific test, see [Test parameters](test-toolkit.md#test-parameters).
## Use correct schema
The following example **passes** this test.
Test name: **Location Should Not Be Hardcoded**
-Your templates should have a parameter named location. Use this parameter for setting the location of resources in your template. In the main template (named azuredeploy.json or mainTemplate.json), this parameter can default to the resource group location. In linked or nested templates, the location parameter shouldn't have a default location.
+Your templates should have a parameter named location. Use this parameter for setting the location of resources in your template. In the main template (named _azuredeploy.json_ or _mainTemplate.json_), this parameter can default to the resource group location. In linked or nested templates, the location parameter shouldn't have a default location.
Users of your template may have limited regions available to them. When you hard code the resource location, users may be blocked from creating a resource in that region. Users could be blocked even if you set the resource location to `"[resourceGroup().location]"`. The resource group may have been created in a region that other users can't access. Those users are blocked from using the template.
When you include parameters for `_artifactsLocation` and `_artifactsLocationSasT
* if you provide one parameter, you must provide the other
* `_artifactsLocation` must be a **string**
* `_artifactsLocation` must have a default value in the main template
-* `_artifactsLocation` can't have a default value in a nested template
+* `_artifactsLocation` can't have a default value in a nested template
* `_artifactsLocation` must have either `"[deployment().properties.templateLink.uri]"` or the raw repo URL for its default value
* `_artifactsLocationSasToken` must be a **secureString**
* `_artifactsLocationSasToken` can only have an empty string for its default value
-* `_artifactsLocationSasToken` can't have a default value in a nested template
+* `_artifactsLocationSasToken` can't have a default value in a nested template
## Declared variables must be used
The next example **passes** this test.
Test name: **ResourceIds should not contain**
-When generating resource IDs, don't use unnecessary functions for optional parameters. By default, the [resourceId](template-functions-resource.md#resourceid) function uses the current subscription and resource group. You don't need to provide those values.
+When generating resource IDs, don't use unnecessary functions for optional parameters. By default, the [resourceId](template-functions-resource.md#resourceid) function uses the current subscription and resource group. You don't need to provide those values.
The following example **fails** this test, because you don't need to provide the current subscription ID and resource group name.
The following example **fails** because it uses a [list*](template-functions-res
}
```
+## Use protectedSettings for commandToExecute secrets
+
+Test name: **CommandToExecute Must Use ProtectedSettings For Secrets**
+
+In a Custom Script Extension, use the encrypted property `protectedSettings` when `commandToExecute` includes secret data such as a password. Examples of secret data types are `secureString`, `secureObject`, `list()` functions, or scripts.
+
+For more information about Custom Script Extension for virtual machines, see [Windows](/azure/virtual-machines/extensions/custom-script-windows), [Linux](/azure/virtual-machines/extensions/custom-script-linux), and the schema [Microsoft.Compute virtualMachines/extensions](/azure/templates/microsoft.compute/virtualmachines/extensions).
+
+In this example, a template with a parameter named `adminPassword` and type `secureString` **passes** the test because the encrypted property `protectedSettings` includes `commandToExecute`.
+
+```json
+"properties": [
+ {
+ "protectedSettings": {
+ "commandToExecute": "[parameters('adminPassword')]"
+ }
+ }
+]
+```
+
+The test **fails** if the unencrypted property `settings` includes `commandToExecute`.
+
+```json
+"properties": [
+ {
+ "settings": {
+ "commandToExecute": "[parameters('adminPassword')]"
+ }
+ }
+]
+```
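As an illustration only (the actual toolkit test is more thorough), the difference between the passing and failing snippets above can be expressed as a minimal check — the function name `find_insecure_command` is hypothetical:

```python
def find_insecure_command(properties: list) -> bool:
    """Return True if any entry places commandToExecute in the plain
    'settings' object rather than the encrypted 'protectedSettings'."""
    return any("commandToExecute" in entry.get("settings", {}) for entry in properties)

# Mirrors the passing and failing JSON examples above.
passing = [{"protectedSettings": {"commandToExecute": "[parameters('adminPassword')]"}}]
failing = [{"settings": {"commandToExecute": "[parameters('adminPassword')]"}}]
print(find_insecure_command(passing))  # False
print(find_insecure_command(failing))  # True
```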
## Next steps
-- To learn about running the test toolkit, see [Use ARM template test toolkit](test-toolkit.md).
-- For a Microsoft Learn module that covers using the test toolkit, see [Preview changes and validate Azure resources by using what-if and the ARM template test toolkit](/learn/modules/arm-template-test/).
+* To learn about running the test toolkit, see [Use ARM template test toolkit](test-toolkit.md).
+* For a Microsoft Learn module that covers using the test toolkit, see [Preview changes and validate Azure resources by using what-if and the ARM template test toolkit](/learn/modules/arm-template-test/).
azure-sql Connectivity Architecture https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/connectivity-architecture.md
Details of how traffic shall be migrated to new Gateways in specific regions are
| China North 2 | 40.73.50.0 |
| East Asia | 52.175.33.150, 13.75.32.4, 13.75.32.14 |
| East US | 40.121.158.30, 40.79.153.12, 40.78.225.32 |
-| East US 2 | 40.79.84.180, 52.177.185.181, 52.167.104.0, 191.239.224.107, 104.208.150.3 |
+| East US 2 | 40.79.84.180, 52.177.185.181, 52.167.104.0, 191.239.224.107, 104.208.150.3, 40.70.144.193 |
| France Central | 40.79.137.0, 40.79.129.1, 40.79.137.8, 40.79.145.12 |
| France South | 40.79.177.0, 40.79.177.10, 40.79.177.12 |
| Germany Central | 51.4.144.100 |
azure-sql Gateway Migration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/gateway-migration.md
The most up-to-date information will be maintained in the [Azure SQL Database ga
# [In progress](#tab/in-progress-ip)

## April 2021
+New SQL Gateways are being added to the following regions:
+- East US 2: 40.70.144.193
+This SQL Gateway will start accepting customer traffic on 30 April 2021.
+ New SQL Gateways are being added to the following regions:
- Norway East: 51.120.96.33
- South East Asia: 13.67.16.193
azure-sql Resource Limits Dtu Elastic Pools https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/resource-limits-dtu-elastic-pools.md
Previously updated : 07/28/2020
Last updated : 03/30/2021

# Resource limits for elastic pools using the DTU purchasing model

[!INCLUDE[appliesto-sqldb](../includes/appliesto-sqldb.md)]
For the same number of DTUs, resources provided to an elastic pool may exceed th
| Max concurrent sessions per pool <sup>3</sup> | 30000 | 30000 | 30000 | 30000 | 30000 | 30000 |
| Min DTU per database choices | 0, 10, 20, 50 | 0, 10, 20, 50, 100 | 0, 10, 20, 50, 100, 200 | 0, 10, 20, 50, 100, 200, 300 | 0, 10, 20, 50, 100, 200, 300, 400 | 0, 10, 20, 50, 100, 200, 300, 400, 800 |
| Max DTU per database choices | 10, 20, 50 | 10, 20, 50, 100 | 10, 20, 50, 100, 200 | 10, 20, 50, 100, 200, 300 | 10, 20, 50, 100, 200, 300, 400 | 10, 20, 50, 100, 200, 300, 400, 800 |
-| Max storage per database (GB) | 500 | 750 | 1024 | 1024 | 1024 | 1024 |
+| Max storage per database (GB) | 1024 | 1024 | 1024 | 1024 | 1024 | 1024 |
||||||||

<sup>1</sup> See [SQL Database pricing options](https://azure.microsoft.com/pricing/details/sql-database/elastic/) for details on additional cost incurred due to any extra storage provisioned.
For the same number of DTUs, resources provided to an elastic pool may exceed th
| Max concurrent sessions per pool <sup>3</sup> | 30000 | 30000 | 30000 | 30000 | 30000 |
| Min DTU per database choices | 0, 10, 20, 50, 100, 200, 300, 400, 800, 1200 | 0, 10, 20, 50, 100, 200, 300, 400, 800, 1200, 1600 | 0, 10, 20, 50, 100, 200, 300, 400, 800, 1200, 1600, 2000 | 0, 10, 20, 50, 100, 200, 300, 400, 800, 1200, 1600, 2000, 2500 | 0, 10, 20, 50, 100, 200, 300, 400, 800, 1200, 1600, 2000, 2500, 3000 |
| Max DTU per database choices | 10, 20, 50, 100, 200, 300, 400, 800, 1200 | 10, 20, 50, 100, 200, 300, 400, 800, 1200, 1600 | 10, 20, 50, 100, 200, 300, 400, 800, 1200, 1600, 2000 | 10, 20, 50, 100, 200, 300, 400, 800, 1200, 1600, 2000, 2500 | 10, 20, 50, 100, 200, 300, 400, 800, 1200, 1600, 2000, 2500, 3000 |
-| Max storage per database (GB) | 1024 | 1024 | 1024 | 1024 | 1024 |
+| Max storage per database (GB) | 1024 | 1536 | 1792 | 2304 | 2816 |
|||||||

<sup>1</sup> See [SQL Database pricing options](https://azure.microsoft.com/pricing/details/sql-database/elastic/) for details on additional cost incurred due to any extra storage provisioned.
For the same number of DTUs, resources provided to an elastic pool may exceed th
| Max concurrent sessions per pool <sup>3</sup> | 30000 | 30000 | 30000 | 30000 | 30000 |
| Min eDTUs per database | 0, 25, 50, 75, 125 | 0, 25, 50, 75, 125, 250 | 0, 25, 50, 75, 125, 250, 500 | 0, 25, 50, 75, 125, 250, 500, 1000 | 0, 25, 50, 75, 125, 250, 500, 1000 |
| Max eDTUs per database | 25, 50, 75, 125 | 25, 50, 75, 125, 250 | 25, 50, 75, 125, 250, 500 | 25, 50, 75, 125, 250, 500, 1000 | 25, 50, 75, 125, 250, 500, 1000 |
-| Max storage per database (GB) | 1024 | 1024 | 1024 | 1024 | 1024 |
+| Max storage per database (GB) | 1024 | 1024 | 1024 | 1024 | 1536 |
|||||||

<sup>1</sup> See [SQL Database pricing options](https://azure.microsoft.com/pricing/details/sql-database/elastic/) for details on additional cost incurred due to any extra storage provisioned.
For the same number of DTUs, resources provided to an elastic pool may exceed th
| Max concurrent sessions per pool <sup>3</sup> | 30000 | 30000 | 30000 | 30000 | 30000 |
| Min DTU per database choices | 0, 25, 50, 75, 125, 250, 500, 1000, 1750 | 0, 25, 50, 75, 125, 250, 500, 1000, 1750 | 0, 25, 50, 75, 125, 250, 500, 1000, 1750 | 0, 25, 50, 75, 125, 250, 500, 1000, 1750 | 0, 25, 50, 75, 125, 250, 500, 1000, 1750, 4000 |
| Max DTU per database choices | 25, 50, 75, 125, 250, 500, 1000, 1750 | 25, 50, 75, 125, 250, 500, 1000, 1750 | 25, 50, 75, 125, 250, 500, 1000, 1750 | 25, 50, 75, 125, 250, 500, 1000, 1750 | 25, 50, 75, 125, 250, 500, 1000, 1750, 4000 |
-| Max storage per database (GB) | 1024 | 1024 | 1024 | 1024 | 1024 |
+| Max storage per database (GB) | 2048 | 2560 | 3072 | 3584 | 4096 |
|||||||

<sup>1</sup> See [SQL Database pricing options](https://azure.microsoft.com/pricing/details/sql-database/elastic/) for details on additional cost incurred due to any extra storage provisioned.
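The per-database limits in these tables are applied when you create or update an elastic pool. As a rough sketch only (the pool name and `apiVersion` are illustrative, and the capacity and `maxSizeBytes` values must be picked from the tables above), an ARM template resource for a Standard pool with 100 eDTUs might look like:

```json
{
  "type": "Microsoft.Sql/servers/elasticPools",
  "apiVersion": "2020-11-01-preview",
  "name": "[concat(parameters('serverName'), '/demo-pool')]",
  "location": "[resourceGroup().location]",
  "sku": {
    "name": "StandardPool",
    "tier": "Standard",
    "capacity": 100
  },
  "properties": {
    "perDatabaseSettings": {
      "minCapacity": 0,
      "maxCapacity": 100
    },
    "maxSizeBytes": 107374182400
  }
}
```

Here `perDatabaseSettings` expresses the "Min DTU per database" and "Max DTU per database" choices, and both values must be drawn from the allowed lists for the pool's eDTU size.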
azure-sql Sql Data Sync Sql Server Configure https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/sql-data-sync-sql-server-configure.md
For PowerShell examples on how to configure SQL Data Sync, see [How to sync betw
| **Use private link** | Choose a service managed private endpoint to establish a secure connection between the sync service and the hub database. |

> [!NOTE]
- > Microsoft recommends to create a new, empty database for use as the **Sync Metadata Database**. Data Sync creates tables in this database and runs a frequent workload. This database is shared as the **Sync Metadata Database** for all sync groups in a selected region and subscription. You can't change the database or its name without removing all sync groups and sync agents in the region.
+ > Microsoft recommends that you create a new, empty database for use as the **Sync Metadata Database**. Data Sync creates tables in this database and runs a frequent workload. This database is shared as the **Sync Metadata Database** for all sync groups in a selected region and subscription. You can't change the database or its name without removing all sync groups and sync agents in the region. Additionally, an Elastic Jobs database can't be used as the SQL Data Sync metadata database, and vice versa.
Select **OK** and wait for the sync group to be created and deployed.
azure-sql Connect Application Instance https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/managed-instance/connect-application-instance.md
Previously updated : 11/09/2018
Last updated : 02/25/2021

# Connect your application to Azure SQL Managed Instance
The following minimal versions of the tools and drivers are recommended if you w
## Next steps

- For information about SQL Managed Instance, see [What is SQL Managed Instance?](sql-managed-instance-paas-overview.md).
-- For a tutorial showing you how to create a new managed instance, see [Create a managed instance](instance-create-quickstart.md).
+- For a tutorial showing you how to create a new managed instance, see [Create a managed instance](instance-create-quickstart.md).
azure-sql Connectivity Architecture Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/managed-instance/connectivity-architecture-overview.md
Deploy SQL Managed Instance in a dedicated subnet inside the virtual network. Th
|Name|Address prefix|Next hop|
|-|--|-|
|subnet-to-vnetlocal|MI SUBNET|Virtual network|
-|mi-13-64-11-nexthop-internet|13.64.0.0/11|Internet|
-|mi-13-104-14-nexthop-internet|13.104.0.0/14|Internet|
-|mi-20-33-16-nexthop-internet|20.33.0.0/16|Internet|
-|mi-20-34-15-nexthop-internet|20.34.0.0/15|Internet|
-|mi-20-36-14-nexthop-internet|20.36.0.0/14|Internet|
-|mi-20-40-13-nexthop-internet|20.40.0.0/13|Internet|
-|mi-20-48-12-nexthop-internet|20.48.0.0/12|Internet|
-|mi-20-64-10-nexthop-internet|20.64.0.0/10|Internet|
-|mi-20-128-16-nexthop-internet|20.128.0.0/16|Internet|
-|mi-20-135-16-nexthop-internet|20.135.0.0/16|Internet|
-|mi-20-136-16-nexthop-internet|20.136.0.0/16|Internet|
-|mi-20-140-15-nexthop-internet|20.140.0.0/15|Internet|
-|mi-20-143-16-nexthop-internet|20.143.0.0/16|Internet|
-|mi-20-144-14-nexthop-internet|20.144.0.0/14|Internet|
-|mi-20-150-15-nexthop-internet|20.150.0.0/15|Internet|
-|mi-20-160-12-nexthop-internet|20.160.0.0/12|Internet|
-|mi-20-176-14-nexthop-internet|20.176.0.0/14|Internet|
-|mi-20-180-14-nexthop-internet|20.180.0.0/14|Internet|
-|mi-20-184-13-nexthop-internet|20.184.0.0/13|Internet|
-|mi-20-192-10-nexthop-internet|20.192.0.0/10|Internet|
-|mi-40-64-10-nexthop-internet|40.64.0.0/10|Internet|
-|mi-51-4-15-nexthop-internet|51.4.0.0/15|Internet|
-|mi-51-8-16-nexthop-internet|51.8.0.0/16|Internet|
-|mi-51-10-15-nexthop-internet|51.10.0.0/15|Internet|
-|mi-51-18-16-nexthop-internet|51.18.0.0/16|Internet|
-|mi-51-51-16-nexthop-internet|51.51.0.0/16|Internet|
-|mi-51-53-16-nexthop-internet|51.53.0.0/16|Internet|
-|mi-51-103-16-nexthop-internet|51.103.0.0/16|Internet|
-|mi-51-104-15-nexthop-internet|51.104.0.0/15|Internet|
-|mi-51-132-16-nexthop-internet|51.132.0.0/16|Internet|
-|mi-51-136-15-nexthop-internet|51.136.0.0/15|Internet|
-|mi-51-138-16-nexthop-internet|51.138.0.0/16|Internet|
-|mi-51-140-14-nexthop-internet|51.140.0.0/14|Internet|
-|mi-51-144-15-nexthop-internet|51.144.0.0/15|Internet|
-|mi-52-96-12-nexthop-internet|52.96.0.0/12|Internet|
-|mi-52-112-14-nexthop-internet|52.112.0.0/14|Internet|
-|mi-52-125-16-nexthop-internet|52.125.0.0/16|Internet|
-|mi-52-126-15-nexthop-internet|52.126.0.0/15|Internet|
-|mi-52-130-15-nexthop-internet|52.130.0.0/15|Internet|
-|mi-52-132-14-nexthop-internet|52.132.0.0/14|Internet|
-|mi-52-136-13-nexthop-internet|52.136.0.0/13|Internet|
-|mi-52-145-16-nexthop-internet|52.145.0.0/16|Internet|
-|mi-52-146-15-nexthop-internet|52.146.0.0/15|Internet|
-|mi-52-148-14-nexthop-internet|52.148.0.0/14|Internet|
-|mi-52-152-13-nexthop-internet|52.152.0.0/13|Internet|
-|mi-52-160-11-nexthop-internet|52.160.0.0/11|Internet|
-|mi-52-224-11-nexthop-internet|52.224.0.0/11|Internet|
-|mi-64-4-18-nexthop-internet|64.4.0.0/18|Internet|
-|mi-65-52-14-nexthop-internet|65.52.0.0/14|Internet|
-|mi-66-119-144-20-nexthop-internet|66.119.144.0/20|Internet|
-|mi-70-37-17-nexthop-internet|70.37.0.0/17|Internet|
-|mi-70-37-128-18-nexthop-internet|70.37.128.0/18|Internet|
-|mi-91-190-216-21-nexthop-internet|91.190.216.0/21|Internet|
-|mi-94-245-64-18-nexthop-internet|94.245.64.0/18|Internet|
-|mi-103-9-8-22-nexthop-internet|103.9.8.0/22|Internet|
-|mi-103-25-156-24-nexthop-internet|103.25.156.0/24|Internet|
-|mi-103-25-157-24-nexthop-internet|103.25.157.0/24|Internet|
-|mi-103-25-158-23-nexthop-internet|103.25.158.0/23|Internet|
-|mi-103-36-96-22-nexthop-internet|103.36.96.0/22|Internet|
-|mi-103-255-140-22-nexthop-internet|103.255.140.0/22|Internet|
-|mi-104-40-13-nexthop-internet|104.40.0.0/13|Internet|
-|mi-104-146-15-nexthop-internet|104.146.0.0/15|Internet|
-|mi-104-208-13-nexthop-internet|104.208.0.0/13|Internet|
-|mi-111-221-16-20-nexthop-internet|111.221.16.0/20|Internet|
-|mi-111-221-64-18-nexthop-internet|111.221.64.0/18|Internet|
-|mi-129-75-16-nexthop-internet|129.75.0.0/16|Internet|
-|mi-131-107-16-nexthop-internet|131.107.0.0/16|Internet|
-|mi-131-253-1-24-nexthop-internet|131.253.1.0/24|Internet|
-|mi-131-253-3-24-nexthop-internet|131.253.3.0/24|Internet|
-|mi-131-253-5-24-nexthop-internet|131.253.5.0/24|Internet|
-|mi-131-253-6-24-nexthop-internet|131.253.6.0/24|Internet|
-|mi-131-253-8-24-nexthop-internet|131.253.8.0/24|Internet|
-|mi-131-253-12-22-nexthop-internet|131.253.12.0/22|Internet|
-|mi-131-253-16-23-nexthop-internet|131.253.16.0/23|Internet|
-|mi-131-253-18-24-nexthop-internet|131.253.18.0/24|Internet|
-|mi-131-253-21-24-nexthop-internet|131.253.21.0/24|Internet|
-|mi-131-253-22-23-nexthop-internet|131.253.22.0/23|Internet|
-|mi-131-253-24-21-nexthop-internet|131.253.24.0/21|Internet|
-|mi-131-253-32-20-nexthop-internet|131.253.32.0/20|Internet|
-|mi-131-253-61-24-nexthop-internet|131.253.61.0/24|Internet|
-|mi-131-253-62-23-nexthop-internet|131.253.62.0/23|Internet|
-|mi-131-253-64-18-nexthop-internet|131.253.64.0/18|Internet|
-|mi-131-253-128-17-nexthop-internet|131.253.128.0/17|Internet|
-|mi-132-245-16-nexthop-internet|132.245.0.0/16|Internet|
-|mi-134-170-16-nexthop-internet|134.170.0.0/16|Internet|
-|mi-134-177-16-nexthop-internet|134.177.0.0/16|Internet|
-|mi-137-116-15-nexthop-internet|137.116.0.0/15|Internet|
-|mi-137-135-16-nexthop-internet|137.135.0.0/16|Internet|
-|mi-138-91-16-nexthop-internet|138.91.0.0/16|Internet|
-|mi-138-196-16-nexthop-internet|138.196.0.0/16|Internet|
-|mi-139-217-16-nexthop-internet|139.217.0.0/16|Internet|
-|mi-139-219-16-nexthop-internet|139.219.0.0/16|Internet|
-|mi-141-251-16-nexthop-internet|141.251.0.0/16|Internet|
-|mi-146-147-16-nexthop-internet|146.147.0.0/16|Internet|
-|mi-147-243-16-nexthop-internet|147.243.0.0/16|Internet|
-|mi-150-171-16-nexthop-internet|150.171.0.0/16|Internet|
-|mi-150-242-48-22-nexthop-internet|150.242.48.0/22|Internet|
-|mi-157-54-15-nexthop-internet|157.54.0.0/15|Internet|
-|mi-157-56-14-nexthop-internet|157.56.0.0/14|Internet|
-|mi-157-60-16-nexthop-internet|157.60.0.0/16|Internet|
-|mi-167-105-16-nexthop-internet|167.105.0.0/16|Internet|
-|mi-167-220-16-nexthop-internet|167.220.0.0/16|Internet|
-|mi-168-61-16-nexthop-internet|168.61.0.0/16|Internet|
-|mi-168-62-15-nexthop-internet|168.62.0.0/15|Internet|
-|mi-191-232-13-nexthop-internet|191.232.0.0/13|Internet|
-|mi-192-32-16-nexthop-internet|192.32.0.0/16|Internet|
-|mi-192-48-225-24-nexthop-internet|192.48.225.0/24|Internet|
-|mi-192-84-159-24-nexthop-internet|192.84.159.0/24|Internet|
-|mi-192-84-160-23-nexthop-internet|192.84.160.0/23|Internet|
-|mi-192-197-157-24-nexthop-internet|192.197.157.0/24|Internet|
-|mi-193-149-64-19-nexthop-internet|193.149.64.0/19|Internet|
-|mi-193-221-113-24-nexthop-internet|193.221.113.0/24|Internet|
-|mi-194-69-96-19-nexthop-internet|194.69.96.0/19|Internet|
-|mi-194-110-197-24-nexthop-internet|194.110.197.0/24|Internet|
-|mi-198-105-232-22-nexthop-internet|198.105.232.0/22|Internet|
-|mi-198-200-130-24-nexthop-internet|198.200.130.0/24|Internet|
-|mi-198-206-164-24-nexthop-internet|198.206.164.0/24|Internet|
-|mi-199-60-28-24-nexthop-internet|199.60.28.0/24|Internet|
-|mi-199-74-210-24-nexthop-internet|199.74.210.0/24|Internet|
-|mi-199-103-90-23-nexthop-internet|199.103.90.0/23|Internet|
-|mi-199-103-122-24-nexthop-internet|199.103.122.0/24|Internet|
-|mi-199-242-32-20-nexthop-internet|199.242.32.0/20|Internet|
-|mi-199-242-48-21-nexthop-internet|199.242.48.0/21|Internet|
-|mi-202-89-224-20-nexthop-internet|202.89.224.0/20|Internet|
-|mi-204-13-120-21-nexthop-internet|204.13.120.0/21|Internet|
-|mi-204-14-180-22-nexthop-internet|204.14.180.0/22|Internet|
-|mi-204-79-135-24-nexthop-internet|204.79.135.0/24|Internet|
-|mi-204-79-179-24-nexthop-internet|204.79.179.0/24|Internet|
-|mi-204-79-181-24-nexthop-internet|204.79.181.0/24|Internet|
-|mi-204-79-188-24-nexthop-internet|204.79.188.0/24|Internet|
-|mi-204-79-195-24-nexthop-internet|204.79.195.0/24|Internet|
-|mi-204-79-196-23-nexthop-internet|204.79.196.0/23|Internet|
-|mi-204-79-252-24-nexthop-internet|204.79.252.0/24|Internet|
-|mi-204-152-18-23-nexthop-internet|204.152.18.0/23|Internet|
-|mi-204-152-140-23-nexthop-internet|204.152.140.0/23|Internet|
-|mi-204-231-192-24-nexthop-internet|204.231.192.0/24|Internet|
-|mi-204-231-194-23-nexthop-internet|204.231.194.0/23|Internet|
-|mi-204-231-197-24-nexthop-internet|204.231.197.0/24|Internet|
-|mi-204-231-198-23-nexthop-internet|204.231.198.0/23|Internet|
-|mi-204-231-200-21-nexthop-internet|204.231.200.0/21|Internet|
-|mi-204-231-208-20-nexthop-internet|204.231.208.0/20|Internet|
-|mi-204-231-236-24-nexthop-internet|204.231.236.0/24|Internet|
-|mi-205-174-224-20-nexthop-internet|205.174.224.0/20|Internet|
-|mi-206-138-168-21-nexthop-internet|206.138.168.0/21|Internet|
-|mi-206-191-224-19-nexthop-internet|206.191.224.0/19|Internet|
-|mi-207-46-16-nexthop-internet|207.46.0.0/16|Internet|
-|mi-207-68-128-18-nexthop-internet|207.68.128.0/18|Internet|
-|mi-208-68-136-21-nexthop-internet|208.68.136.0/21|Internet|
-|mi-208-76-44-22-nexthop-internet|208.76.44.0/22|Internet|
-|mi-208-84-21-nexthop-internet|208.84.0.0/21|Internet|
-|mi-209-240-192-19-nexthop-internet|209.240.192.0/19|Internet|
-|mi-213-199-128-18-nexthop-internet|213.199.128.0/18|Internet|
-|mi-216-32-180-22-nexthop-internet|216.32.180.0/22|Internet|
-|mi-216-220-208-20-nexthop-internet|216.220.208.0/20|Internet|
-|mi-23-96-13-nexthop-internet|23.96.0.0/13|Internet|
-|mi-42-159-16-nexthop-internet|42.159.0.0/16|Internet|
-|mi-51-13-17-nexthop-internet|51.13.0.0/17|Internet|
-|mi-51-107-16-nexthop-internet|51.107.0.0/16|Internet|
-|mi-51-116-16-nexthop-internet|51.116.0.0/16|Internet|
-|mi-51-120-16-nexthop-internet|51.120.0.0/16|Internet|
-|mi-51-120-128-17-nexthop-internet|51.120.128.0/17|Internet|
-|mi-51-124-16-nexthop-internet|51.124.0.0/16|Internet|
-|mi-102-37-18-nexthop-internet|102.37.0.0/18|Internet|
-|mi-102-133-16-nexthop-internet|102.133.0.0/16|Internet|
-|mi-199-30-16-20-nexthop-internet|199.30.16.0/20|Internet|
-|mi-204-79-180-24-nexthop-internet|204.79.180.0/24|Internet|
+|mi-azurecloud-REGION-internet|AzureCloud.REGION|Internet|
+|mi-azurecloud-REGION_PAIR-internet|AzureCloud.REGION_PAIR|Internet|
+|mi-azuremonitor-internet|AzureMonitor|Internet|
+|mi-corpnetpublic-internet|CorpNetPublic|Internet|
+|mi-corpnetsaw-internet|CorpNetSaw|Internet|
+|mi-eventhub-REGION-internet|EventHub.REGION|Internet|
+|mi-eventhub-REGION_PAIR-internet|EventHub.REGION_PAIR|Internet|
+|mi-sqlmanagement-internet|SqlManagement|Internet|
+|mi-storage-internet|Storage|Internet|
+|mi-storage-REGION-internet|Storage.REGION|Internet|
+|mi-storage-REGION_PAIR-internet|Storage.REGION_PAIR|Internet|
||||

\* MI SUBNET refers to the IP address range for the subnet in the form x.x.x.x/y. You can find this information in the Azure portal, in subnet properties.
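If you manage the route table from an ARM template, a service-tag entry from the table above can be expressed roughly as follows (the region in the tag, the route-table parameter name, and the `apiVersion` are illustrative):

```json
{
  "type": "Microsoft.Network/routeTables/routes",
  "apiVersion": "2020-11-01",
  "name": "[concat(parameters('routeTableName'), '/mi-azurecloud-eastus-internet')]",
  "properties": {
    "addressPrefix": "AzureCloud.EastUS",
    "nextHopType": "Internet"
  }
}
```

Using a service tag such as `AzureCloud.EastUS` as the `addressPrefix` avoids maintaining the long list of explicit IP ranges that these entries replace, because Azure keeps the tag's address ranges current.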
The following virtual network features are currently *not supported* with SQL Ma
- From the [Azure portal](instance-create-quickstart.md).
- By using [PowerShell](scripts/create-configure-managed-instance-powershell.md).
- By using [an Azure Resource Manager template](https://azure.microsoft.com/resources/templates/101-sqlmi-new-vnet/).
- - By using [an Azure Resource Manager template (using JumpBox, with SSMS included)](https://azure.microsoft.com/resources/templates/201-sqlmi-new-vnet-w-jumpbox/).
+ - By using [an Azure Resource Manager template (using JumpBox, with SSMS included)](https://azure.microsoft.com/resources/templates/201-sqlmi-new-vnet-w-jumpbox/).
azure-sql Sql Server To Sql Database Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/migration-guides/database/sql-server-to-sql-database-overview.md
Title: "SQL Server to Azure SQL Database: Migration overview"
-description: Learn about the different tools and options available to migrate your SQL Server databases to Azure SQL Database.
+description: Learn about the tools and options available to migrate your SQL Server databases to Azure SQL Database.
Last updated 11/06/2020
# Migration overview: SQL Server to Azure SQL Database [!INCLUDE[appliesto--sqldb](../../includes/appliesto-sqldb.md)]
-Learn about different migration options and considerations to migrate your SQL Server to Azure SQL Database.
+Learn about the options and considerations for migrating your SQL Server databases to Azure SQL Database.
-You can migrate SQL Server running on-premises or on:
+You can migrate SQL Server databases running on-premises or on:
-- SQL Server on Virtual Machines
-- Amazon Web Services (AWS) EC2
-- Amazon Relational Database Service (AWS RDS)
-- Compute Engine (Google Cloud Platform - GCP)
-- Cloud SQL for SQL Server (Google Cloud Platform – GCP)
+- SQL Server on Azure Virtual Machines.
+- Amazon Web Services (AWS) Elastic Compute Cloud (EC2).
+- AWS Relational Database Service (RDS).
+- Compute Engine in Google Cloud Platform (GCP).
+- Cloud SQL for SQL Server in GCP.
For other migration guides, see [Database Migration](https://docs.microsoft.com/data-migration).

## Overview
-[Azure SQL Database](../../database/sql-database-paas-overview.md) is a recommended target option for SQL Server workloads that require a fully managed Platform as a Service (PaaS). SQL Database handles most database management functions, along with high availability, intelligent query processing, scalability, and performance capabilities built in to suit many different application types.
+[Azure SQL Database](../../database/sql-database-paas-overview.md) is a recommended target option for SQL Server workloads that require a fully managed platform as a service (PaaS). SQL Database handles most database management functions. It also has built-in high availability, intelligent query processing, scalability, and performance capabilities to suit many application types.
SQL Database provides flexibility with multiple [deployment models](../../database/sql-database-paas-overview.md#deployment-models) and [service tiers](../../database/service-tiers-vcore.md#service-tiers) that cater to different types of applications or workloads.
-One of the key benefits of migrating to SQL Database is that you can modernize your application by leveraging the PaaS capabilities and eliminate any dependency on technical components that are scoped at the instance level such as SQL Agent jobs.
+One of the key benefits of migrating to SQL Database is that you can modernize your application by using the PaaS capabilities. You can then eliminate any dependency on technical components that are scoped at the instance level, such as SQL Agent jobs.
-You can also save on cost by migrating your SQL Server on-premises licenses to Azure SQL Database using the [Azure Hybrid Benefit](https://azure.microsoft.com/pricing/hybrid-benefit/) for SQL Server should you choose the [vCore-based purchasing model](../../database/service-tiers-vcore.md).
-
-This guide aims to clarify migration options and considerations as you prepare to migrate your SQL Server databases to Azure SQL Database.
+You can also save costs by using the [Azure Hybrid Benefit](https://azure.microsoft.com/pricing/hybrid-benefit/) for SQL Server to migrate your SQL Server on-premises licenses to Azure SQL Database. This option is available if you choose the [vCore-based purchasing model](../../database/service-tiers-vcore.md).
## Considerations
-The key factors to consider when evaluating migration options depend on:
+The key factors to consider when you're evaluating migration options are:
- Number of servers and databases
- Size of databases
- Acceptable business downtime during the migration process
-The migration options listed in this guide take these factors into account. For logical data migration to Azure SQL Database, the time to migrate can depend both on the number of objects in a database and the size of the database.
-
-Different tools are available for different workloads and user preferences. Some tools can be used to perform a quick migration of a single database using a UI-based tool while other tools can migrate multiple databases that can be automated to handle migrations at scale.
+The migration options listed in this guide take these factors into account. For logical data migration to Azure SQL Database, the time to migrate can depend on both the number of objects in a database and the size of the database.
-## Choose appropriate target
+Tools are available for various workloads and user preferences. Some tools can be used to perform a quick migration of a single database through a UI-based tool. Other tools can automate the migration of multiple databases to handle migrations at scale.
-Consider general guidelines to help you choose the right deployment model and service tier of Azure SQL Database. You can choose compute and storage resources during deployment and then [change them afterwards using the Azure portal](../../database/scale-resources.md) without incurring downtime for your application.
+## Choose an appropriate target
+Consider general guidelines to help you choose the right deployment model and service tier of Azure SQL Database. You can choose compute and storage resources during deployment and then [change them afterward by using the Azure portal](../../database/scale-resources.md) without incurring downtime for your application.
-**Deployment models**: Understand your application workload and the usage pattern to decide between a single database or elastic pool.
+**Deployment models**: Understand your application workload and the usage pattern to decide between a single database or an elastic pool.
-- A [single database](../../database/single-database-overview.md) represents a fully managed database suitable for most modern cloud applications and microservices.
-- An [elastic pool](../../database/elastic-pool-overview.md) is a collection of single databases with a shared set of resources such as CPU or memory and suitable for combining databases in a pool with predictable usage patterns that can effectively share the same set of resources.
+- A [single database](../../database/single-database-overview.md) represents a fully managed database that's suitable for most modern cloud applications and microservices.
+- An [elastic pool](../../database/elastic-pool-overview.md) is a collection of single databases with a shared set of resources, such as CPU or memory. It's suitable for combining databases in a pool with predictable usage patterns that can effectively share the same set of resources.
-**Purchasing models**: Choose between the vCore, DTU, or serverless purchasing model.
+**Purchasing models**: Choose between the vCore, database transaction unit (DTU), or serverless purchasing models.
-- The [vCore model](../../database/service-tiers-vcore.md) lets you choose the number of vCores for your Azure SQL Database, making it the easiest choice when translating from on-premises SQL Server. This is the only option that supports saving on license cost with the [Azure Hybrid Benefit](https://azure.microsoft.com/pricing/hybrid-benefit/).
-- The [DTU model](../../database/service-tiers-dtu.md) abstracts the underlying compute, memory, and IO resources in order to provide a blended DTU.
-- The [serverless model](../../database/serverless-tier-overview.md) is intended for workloads that require automatic on-demand scaling with compute resources billed per second of usage. The serverless compute tier automatically pauses databases during inactive periods (where only storage is billed), and automatically resumes databases when activity returns.
+- The [vCore model](../../database/service-tiers-vcore.md) lets you choose the number of vCores for Azure SQL Database, so it's the easiest choice when you're translating from on-premises SQL Server. This is the only option that supports saving license costs with the [Azure Hybrid Benefit](https://azure.microsoft.com/pricing/hybrid-benefit/).
+- The [DTU model](../../database/service-tiers-dtu.md) abstracts the underlying compute, memory, and I/O resources to provide a blended DTU.
+- The [serverless model](../../database/serverless-tier-overview.md) is for workloads that require automatic on-demand scaling with compute resources billed per second of usage. The serverless compute tier automatically pauses databases during inactive periods (where only storage is billed). It automatically resumes databases when activity returns.
**Service tiers**: Choose between three service tiers designed for different types of applications.

-- [General Purpose / Standard service tier](../../database/service-tier-general-purpose.md) offers a balanced budget-oriented option with compute and storage suitable to deliver mid-lower tier applications, with redundancy built in at the storage layer to recover from failures. Designed for most database workloads.
-- [Business Critical / Premium service tier](../../database/service-tier-business-critical.md) is for high tier applications that require high transaction rates, low latency IO, and a high level of resiliency with secondary replicas available for both failover and to offload read workloads.
-- [Hyperscale service tier](../../database/service-tier-hyperscale.md) is for databases that have growing data volumes and need to automatically scale up to 100-TB database size. Designed for very large databases.
+- [General Purpose/standard service tier](../../database/service-tier-general-purpose.md) offers a balanced budget-oriented option with compute and storage suitable to deliver applications in the middle and lower tiers. Redundancy is built in at the storage layer to recover from failures. It's designed for most database workloads.
+- [Business Critical/premium service tier](../../database/service-tier-business-critical.md) is for high-tier applications that require high transaction rates, low-latency I/O, and a high level of resiliency. Secondary replicas are available for failover and to offload read workloads.
+- [Hyperscale service tier](../../database/service-tier-hyperscale.md) is for databases that have growing data volumes and need to automatically scale up to 100 TB in database size. It's designed for very large databases.
> [!IMPORTANT]
-> [Transaction log rate is governed](../../database/resource-limits-logical-server.md#transaction-log-rate-governance) in Azure SQL Database to limit high ingestion rates. As such, during migration, it may be necessary to scale target database resources (vCores/DTUs) to ease pressure on CPU or throughput. Choose the appropriately-sized target database, but plan to scale resources up for the migration if necessary.
+> [Transaction log rate is governed](../../database/resource-limits-logical-server.md#transaction-log-rate-governance) in Azure SQL Database to limit high ingestion rates. As such, during migration, you might have to scale target database resources (vCores or DTUs) to ease pressure on CPU or throughput. Choose the appropriately sized target database, but plan to scale resources up for the migration if necessary.
-### SQL Server on Azure VM alternative
+### SQL Server VM alternative
-Your business may have requirements that make [SQL Server on Azure Virtual Machines](../../virtual-machines/windows/sql-server-on-azure-vm-iaas-what-is-overview.md) a more suitable target than Azure SQL Database.
+Your business might have requirements that make [SQL Server on Azure Virtual Machines](../../virtual-machines/windows/sql-server-on-azure-vm-iaas-what-is-overview.md) a more suitable target than Azure SQL Database.
-If the following apply to your business, consider moving to a SQL Server on Azure VM instead:
+If one of the following conditions applies to your business, consider moving to a SQL Server virtual machine (VM) instead:
-- If you require direct access to the operating system or file system, such as to install third-party or custom agents on the same virtual machine with SQL Server.
-- If you have strict dependency on features that are still not supported, such as FileStream/FileTable, PolyBase, and cross-instance transactions.
-- If you absolutely need to stay at a specific version of SQL Server (2012, for instance).
-- If your compute requirements are much lower than managed instance offers (one vCore, for instance), and database consolidation is not an acceptable option.
+- You require direct access to the operating system or file system, such as to install third-party or custom agents on the same virtual machine with SQL Server.
+- You have strict dependency on features that are still not supported, such as FileStream/FileTable, PolyBase, and cross-instance transactions.
+- You need to stay at a specific version of SQL Server (2012, for example).
+- Your compute requirements are much lower than a managed instance offers (one vCore, for example), and database consolidation is not an acceptable option.
## Migration tools
-The recommended tools for migration are the Data Migration Assistant and the Azure Database Migration Service. There are other alternative migration options available as well.
-
-### Recommended tools
-
-The following table lists the recommended migration tools:
+We recommend the following migration tools:
|Technology | Description|
|||
-| [Azure Migrate](../../../migrate/how-to-create-azure-sql-assessment.md) | Azure Migrate for Azure SQL allows you to discover and assess your SQL data estate at scale when on VMware, providing Azure SQL deployment recommendations, target sizing, and monthly estimates. |
-|[Data Migration Assistant (DMA)](/sql/dma/dma-migrateonpremsqltosqldb)|The Data Migration Assistant is a desktop tool that provides seamless assessments of SQL Server and migrations to Azure SQL Database (both schema and data). The tool can be installed on a server on-premises or on your local machine that has connectivity to your source databases. The migration process is a logical data movement between objects in the source and target database. </br> - Migrate single databases (both schema and data)|
-|[Azure Database Migration Service (DMS)](../../../dms/tutorial-sql-server-to-azure-sql.md)|A first party Azure service that can migrate your SQL Server databases to Azure SQL Database using the Azure portal or automated with PowerShell. Azure DMS requires you to select a preferred Azure Virtual Network (VNet) during provisioning to ensure there is connectivity to your source SQL Server databases. </br> - Migrate single databases or at scale. |
+| [Azure Migrate](../../../migrate/how-to-create-azure-sql-assessment.md) | This Azure service helps you discover and assess your SQL data estate at scale on VMware. It provides Azure SQL deployment recommendations, target sizing, and monthly estimates. |
+|[Data Migration Assistant](/sql/dma/dma-migrateonpremsqltosqldb)|This desktop tool from Microsoft provides seamless assessments of SQL Server and single-database migrations to Azure SQL Database (both schema and data). </br></br>The tool can be installed on a server on-premises or on your local machine that has connectivity to your source databases. The migration process is a logical data movement between objects in the source and target databases.|
+|[Azure Database Migration Service](../../../dms/tutorial-sql-server-to-azure-sql.md)|This Azure service can migrate SQL Server databases to Azure SQL Database through the Azure portal or automatically through PowerShell. Database Migration Service requires you to select a preferred Azure virtual network during provisioning to ensure connectivity to your source SQL Server databases. You can migrate single databases or at scale. |
| | |
-### Alternative tools
The following table lists alternative migration tools:

|Technology |Description |
|||
-|[Transactional replication](../../database/replication-to-sql-database.md)|Replicate data from source SQL Server database table(s) to SQL Database by providing a publisher-subscriber type migration option while maintaining transactional consistency. Incremental data changes are propagated to Subscribers as they occur on the Publishers.|
-|[Import Export Service / BACPAC](../../database/database-import.md)|[BACPAC](/sql/relational-databases/data-tier-applications/data-tier-applications#bacpac) is a Windows file with a `.bacpac` extension that encapsulates a database's schema and data. BACPAC can be used to both export data from a source SQL Server and to import the data into Azure SQL Database. BACPAC file can be imported to a new Azure SQL Database using the Azure portal. </br></br> For scale and performance with large databases sizes or large number of databases, you should consider using the [SqlPackage](../../database/database-import.md#using-sqlpackage) command-line utility to export and import databases.|
-|[Bulk copy](/sql/relational-databases/import-export/import-and-export-bulk-data-by-using-the-bcp-utility-sql-server)|The [bulk copy program (bcp) utility](/sql/tools/bcp-utility) copies data from an instance of SQL Server into a data file. Use the BCP utility to export the data from your source and import the data file into the target SQL Database. </br></br> For high-speed bulk copy operations to move data to Azure SQL Database, [Smart Bulk Copy tool](/samples/azure-samples/smartbulkcopy/smart-bulk-copy/) can be used to maximize transfer speed by leveraging parallel copy tasks.|
-|[Azure Data Factory (ADF)](../../../data-factory/connector-azure-sql-database.md)|The [Copy activity](../../../data-factory/copy-activity-overview.md) in Azure Data Factory migrates data from source SQL Server database(s) to SQL Database using built-in connectors and an [Integration Runtime](../../../data-factory/concepts-integration-runtime.md).</br> </br> ADF supports a wide range of [connectors](../../../data-factory/connector-overview.md) to move data from SQL Server sources to SQL Database.|
-|[SQL Data Sync](../../database/sql-data-sync-data-sql-server-sql-database.md)|SQL Data Sync is a service built on Azure SQL Database that lets you synchronize the data you select bi-directionally across multiple databases, both on-premises and in the cloud.</br>Data Sync is useful in cases where data needs to be kept updated across several databases in Azure SQL Database or SQL Server.|
+|[Transactional replication](../../database/replication-to-sql-database.md)|Replicate data from source SQL Server database tables to Azure SQL Database by providing a publisher-subscriber type migration option while maintaining transactional consistency. Incremental data changes are propagated to subscribers as they occur on the publishers.|
+|[Import Export Service/BACPAC](../../database/database-import.md)|[BACPAC](/sql/relational-databases/data-tier-applications/data-tier-applications#bacpac) is a Windows file with a .bacpac extension that encapsulates a database's schema and data. You can use BACPAC to both export data from a SQL Server source and import the data into Azure SQL Database. A BACPAC file can be imported to a new SQL database through the Azure portal. </br></br> For scale and performance with large database sizes or a large number of databases, consider using the [SqlPackage](../../database/database-import.md#using-sqlpackage) command-line tool to export and import databases.|
+|[Bulk copy](/sql/relational-databases/import-export/import-and-export-bulk-data-by-using-the-bcp-utility-sql-server)|The [bulk copy program (bcp) tool](/sql/tools/bcp-utility) copies data from an instance of SQL Server into a data file. Use the tool to export the data from your source and import the data file into the target SQL database. </br></br> For high-speed bulk copy operations to move data to Azure SQL Database, you can use the [Smart Bulk Copy tool](/samples/azure-samples/smartbulkcopy/smart-bulk-copy/) to maximize transfer speed by taking advantage of parallel copy tasks.|
+|[Azure Data Factory](../../../data-factory/connector-azure-sql-database.md)|The [Copy activity](../../../data-factory/copy-activity-overview.md) in Azure Data Factory migrates data from source SQL Server databases to Azure SQL Database by using built-in connectors and an [integration runtime](../../../data-factory/concepts-integration-runtime.md).</br> </br> Data Factory supports a wide range of [connectors](../../../data-factory/connector-overview.md) to move data from SQL Server sources to Azure SQL Database.|
+|[SQL Data Sync](../../database/sql-data-sync-data-sql-server-sql-database.md)|SQL Data Sync is a service built on Azure SQL Database that lets you synchronize selected data bidirectionally across multiple databases, both on-premises and in the cloud.</br>Data Sync is useful in cases where data needs to be kept updated across several databases in Azure SQL Database or SQL Server.|
| | |

## Compare migration options
-Compare migration options to choose the path appropriate to your business needs.
-
-### Recommended options
+Compare migration options to choose the path that's appropriate to your business needs.
-The following table compares the recommended migration options:
+The following table compares the migration options that we recommend:
|Migration option |When to use |Considerations |
||||
-|[Data Migration Assistant (DMA)](/sql/dma/dma-migrateonpremsqltosqldb) | - Migrate single databases (both schema and data). </br> - Can accommodate downtime during the data migration process. </br> </br> Supported sources: </br> - SQL Server (2005 - 2019) on-premises or Azure VM </br> - AWS EC2 </br> - AWS RDS </br> - GCP Compute SQL Server VM | - Migration activity performs data movement between database objects (from source to target) and hence recommended to run during off-peak times. </br> - DMA reports the status of migration per database object including the number of rows migrated. </br> - For large migrations (number of databases / size of database), use the Azure Database Migration Service listed below.|
-|[Azure Database Migration Service (DMS)](../../../dms/tutorial-sql-server-to-azure-sql.md)| - Migrate single databases or at scale. </br> - Can accommodate downtime during migration process. </br> </br> Supported sources: </br> - SQL Server (2005 - 2019) on-premises or Azure VM </br> - AWS EC2 </br> - AWS RDS </br> - GCP Compute SQL Server VM | - Migrations at scale can be automated via [PowerShell](../../../dms/howto-sql-server-to-azure-sql-powershell.md). </br> - Time to complete migration is dependent on database size and the number of objects in the database. </br> - Requires the source database to set as Read-Only. |
+|[Data Migration Assistant](/sql/dma/dma-migrateonpremsqltosqldb) | - Migrate single databases (both schema and data). </br> - Can accommodate downtime during the data migration process. </br> </br> Supported sources: </br> - SQL Server (2005 to 2019) on-premises or Azure VM </br> - AWS EC2 </br> - AWS RDS </br> - GCP Compute SQL Server VM | - Migration activity performs data movement between database objects (from source to target), so we recommend that you run it during off-peak times. </br> - Data Migration Assistant reports the status of migration per database object, including the number of rows migrated. </br> - For large migrations (number of databases or size of database), use Azure Database Migration Service.|
+|[Azure Database Migration Service](../../../dms/tutorial-sql-server-to-azure-sql.md)| - Migrate single databases or at scale. </br> - Can accommodate downtime during the migration process. </br> </br> Supported sources: </br> - SQL Server (2005 to 2019) on-premises or Azure VM </br> - AWS EC2 </br> - AWS RDS </br> - GCP Compute SQL Server VM | - Migrations at scale can be automated via [PowerShell](../../../dms/howto-sql-server-to-azure-sql-powershell.md). </br> - Time to complete migration depends on database size and the number of objects in the database. </br> - Requires the source database to be set as read-only. |
| | | |
-### Alternative options
- The following table compares the alternative migration options:
-|Method / technology |When to use |Considerations |
+|Method or technology |When to use |Considerations |
||||
-|[Transactional replication](../../database/replication-to-sql-database.md)| - Migrate by continuously publishing changes from source database tables to target SQL Database tables. </br> - Full or partial database migrations of selected tables (subset of database). </br> </br> Supported sources: </br> - [SQL Server (2016 - 2019) with some limitations](/sql/relational-databases/replication/replication-backward-compatibility) </br> - AWS EC2 </br> - GCP Compute SQL Server VM | - Setup is relatively complex compared to other migration options. </br> - Provides a continuous replication option to migrate data (without taking the databases offline). </br> - Transactional replication has a number of limitations to consider when setting up the Publisher on the source SQL Server. See [Limitations on Publishing Objects](/sql/relational-databases/replication/publish/publish-data-and-database-objects#limitations-on-publishing-objects) to learn more. </br>- It is possible to [monitor replication activity](/sql/relational-databases/replication/monitor/monitoring-replication). |
-|[Import Export Service / BACPAC](../../database/database-import.md)| - Migrate individual Line-of-business application databases. </br>- Suited for smaller databases. </br> - Does not require a separate migration service or tool. </br> </br> Supported sources: </br> - SQL Server (2005 - 2019) on-premises or Azure VM </br> - AWS EC2 </br> - AWS RDS </br> - GCP Compute SQL Server VM | - Requires downtime as data needs to be exported at the source and imported at the destination. </br> - The file formats and data types used in the export / import need to be consistent with table schemas to avoid truncation / data type mismatch errors. </br> - Time taken to export a database with a large number of objects can be significantly higher. |
-|[Bulk copy](/sql/relational-databases/import-export/import-and-export-bulk-data-by-using-the-bcp-utility-sql-server)| - Migrate full or partial data migrations. </br> - Can accommodate downtime. </br> </br> Supported sources: </br> - SQL Server (2005 - 2019) on-premises or Azure VM </br> - AWS EC2 </br> - AWS RDS </br> - GCP Compute SQL Server VM | - Requires downtime for exporting data from source and importing into target. </br> - The file formats and data types used in the export / import need to be consistent with table schemas. |
-|[Azure Data Factory (ADF)](../../../data-factory/connector-azure-sql-database.md)| - Migrate and/or transforming data from source SQL Server databases. </br> - Merging data from multiple sources of data to Azure SQL Database typically for Business Intelligence (BI) workloads. | - Requires creating data movement pipelines in ADF to move data from source to destination. </br> - [Cost](https://azure.microsoft.com/pricing/details/data-factory/data-pipeline/) is an important consideration and is based on the pipeline triggers, activity runs, duration of data movement, etc. |
-|[SQL Data Sync](../../database/sql-data-sync-data-sql-server-sql-database.md)| - Synchronize data between source and target databases.</br> - Suitable to run continuous sync between Azure SQL Database and on-premises SQL Server in a bi-directional flow. | - Azure SQL Database must be the Hub database for sync with on-prem SQL Server database as Member database.</br> - Compared to Transactional Replication, SQL Data Sync supports bi-directional data sync between on-premises and Azure SQL Database. </br> - Can have a higher performance impact depending on the workload.|
+|[Transactional replication](../../database/replication-to-sql-database.md)| - Migrate by continuously publishing changes from source database tables to target SQL Database tables. </br> - Do full or partial database migrations of selected tables (subset of a database). </br> </br> Supported sources: </br> - [SQL Server (2016 to 2019) with some limitations](/sql/relational-databases/replication/replication-backward-compatibility) </br> - AWS EC2 </br> - GCP Compute SQL Server VM | - Setup is relatively complex compared to other migration options. </br> - Provides a continuous replication option to migrate data (without taking the databases offline). </br> - Transactional replication has limitations to consider when you're setting up the publisher on the source SQL Server instance. See [Limitations on publishing objects](/sql/relational-databases/replication/publish/publish-data-and-database-objects#limitations-on-publishing-objects) to learn more. </br>- It's possible to [monitor replication activity](/sql/relational-databases/replication/monitor/monitoring-replication). |
+|[Import Export Service/BACPAC](../../database/database-import.md)| - Migrate individual line-of-business application databases. </br>- Suited for smaller databases. </br> - Does not require a separate migration service or tool. </br> </br> Supported sources: </br> - SQL Server (2005 to 2019) on-premises or Azure VM </br> - AWS EC2 </br> - AWS RDS </br> - GCP Compute SQL Server VM | - Requires downtime because data needs to be exported at the source and imported at the destination. </br> - The file formats and data types used in the export or import need to be consistent with table schemas to avoid truncation or data-type mismatch errors. </br> - Time taken to export a database with a large number of objects can be significantly higher. |
+|[Bulk copy](/sql/relational-databases/import-export/import-and-export-bulk-data-by-using-the-bcp-utility-sql-server)| - Do full or partial data migrations. </br> - Can accommodate downtime. </br> </br> Supported sources: </br> - SQL Server (2005 to 2019) on-premises or Azure VM </br> - AWS EC2 </br> - AWS RDS </br> - GCP Compute SQL Server VM | - Requires downtime for exporting data from the source and importing into the target. </br> - The file formats and data types used in the export or import need to be consistent with table schemas. |
+|[Azure Data Factory](../../../data-factory/connector-azure-sql-database.md)| - Migrate and/or transform data from source SQL Server databases. </br> - Merge data from multiple sources into Azure SQL Database, typically for business intelligence (BI) workloads. | - Requires creating data movement pipelines in Data Factory to move data from source to destination. </br> - [Cost](https://azure.microsoft.com/pricing/details/data-factory/data-pipeline/) is an important consideration and is based on factors like pipeline triggers, activity runs, and duration of data movement. |
+|[SQL Data Sync](../../database/sql-data-sync-data-sql-server-sql-database.md)| - Synchronize data between source and target databases.</br> - Suitable to run continuous sync between Azure SQL Database and on-premises SQL Server in a bidirectional flow. | - Azure SQL Database must be the hub database for sync with an on-premises SQL Server database as a member database.</br> - Compared to transactional replication, SQL Data Sync supports bidirectional data sync between on-premises and Azure SQL Database. </br> - Can have a higher performance impact, depending on the workload.|
| | | |

## Feature interoperability
-There are additional considerations when migrating workloads that rely on other SQL Server features.
+There are more considerations when you're migrating workloads that rely on other SQL Server features.
-#### SQL Server Integration Services
-Migrate SQL Server Integration Services (SSIS) packages to Azure by redeploying the packages to Azure-SSIS runtime in [Azure Data Factory](../../../data-factory/introduction.md). Azure Data Factory [supports migration of SSIS packages](../../../data-factory/scenario-ssis-migration-overview.md#azure-sql-database-as-database-workload-destination) by providing a runtime built to execute SSIS packages in Azure. Alternatively, you can also rewrite the SSIS ETL logic natively in ADF using [Dataflows](../../../data-factory/concepts-data-flow-overview.md).
+### SQL Server Integration Services
+Migrate SQL Server Integration Services (SSIS) packages to Azure by redeploying the packages to the Azure-SSIS runtime in [Azure Data Factory](../../../data-factory/introduction.md). Azure Data Factory [supports migration of SSIS packages](../../../data-factory/scenario-ssis-migration-overview.md#azure-sql-database-as-database-workload-destination) by providing a runtime built to run SSIS packages in Azure. Alternatively, you can rewrite the SSIS ETL (extract, transform, load) logic natively in Azure Data Factory by using [data flows](../../../data-factory/concepts-data-flow-overview.md).
-#### SQL Server Reporting Services
-Migrate SQL Server Reporting Services (SSRS) reports to paginated reports in Power BI. Use the [RDL Migration Tool](https://github.com/microsoft/RdlMigration) to help prepare and migrate your reports. This tool was developed by Microsoft to help customers migrate RDL reports from their SSRS servers to Power BI. It is available on GitHub, and it documents an end-to-end walkthrough of the migration scenario.
+### SQL Server Reporting Services
+Migrate SQL Server Reporting Services (SSRS) reports to paginated reports in Power BI. Use the [RDL Migration Tool](https://github.com/microsoft/RdlMigration) to help prepare and migrate your reports. Microsoft developed this tool to help customers migrate Report Definition Language (RDL) reports from their SSRS servers to Power BI. It's available on GitHub, and it documents an end-to-end walkthrough of the migration scenario.
-#### High availability
-Manual setup of SQL Server high availability features like Always On failover cluster instances and Always On availability groups become obsolete on the target Azure SQL Database as high availability architecture is already built into both [General Purpose (standard availability model)](../../database/high-availability-sla.md#basic-standard-and-general-purpose-service-tier-locally-redundant-availability) and [Business Critical (premium availability model)](../../database/high-availability-sla.md#premium-and-business-critical-service-tier-locally-redundant-availability) SQL Database. The Business Critical / Premium Service Tier also provides read scale-out that allows connecting into one of the secondary nodes for read-only purposes.
+### High availability
+Manual setup of SQL Server high-availability features like Always On failover cluster instances and Always On availability groups becomes obsolete on the target SQL database. High-availability architecture is already built into both [General Purpose (standard availability model)](../../database/high-availability-sla.md#basic-standard-and-general-purpose-service-tier-locally-redundant-availability) and [Business Critical (premium availability model)](../../database/high-availability-sla.md#premium-and-business-critical-service-tier-locally-redundant-availability) service tiers for Azure SQL Database. The Business Critical/premium service tier also provides read scale-out that lets you connect to one of the secondary nodes for read-only purposes.
-Beyond the high availability architecture that is included in SQL Database, there is also the [auto-failover groups](../../database/auto-failover-group-overview.md) feature that allows you to manage the replication and failover of databases in a managed instance to another region.
+Beyond the high-availability architecture that's included in Azure SQL Database, the [auto-failover groups](../../database/auto-failover-group-overview.md) feature allows you to manage the replication and failover of databases in a managed instance to another region.
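From the client side, read scale-out is typically requested by adding `ApplicationIntent=ReadOnly` to the connection string. As a minimal sketch (server, database, and credential values are placeholders, and the ODBC driver name assumes the common Microsoft ODBC client stack):

```python
def build_connection_string(server: str, database: str, user: str, password: str,
                            read_only: bool = False) -> str:
    """Build an ODBC connection string for Azure SQL Database.

    When read_only=True, ApplicationIntent=ReadOnly asks the Business Critical
    (premium) tier to route the session to a read-only secondary replica.
    """
    parts = [
        "Driver={ODBC Driver 17 for SQL Server}",
        f"Server=tcp:{server},1433",
        f"Database={database}",
        f"Uid={user}",
        f"Pwd={password}",
        "Encrypt=yes",
    ]
    if read_only:
        parts.append("ApplicationIntent=ReadOnly")
    return ";".join(parts)


# Reporting sessions get routed to a secondary; OLTP sessions use the primary.
reporting = build_connection_string(
    "contoso-srv.database.windows.net", "SalesDb", "appuser",
    "placeholder-password", read_only=True)
```

With a client such as `pyodbc`, you would pass this string to the connect call; sessions opened with read-only intent cannot perform writes.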
-#### SQL Agent jobs
-SQL Agent jobs are not directly supported in Azure SQL Database and will need to be deployed to [Elastic Database Jobs (Preview)](../../database/job-automation-overview.md).
+### SQL Agent jobs
+SQL Agent jobs are not directly supported in Azure SQL Database and need to be deployed to [elastic database jobs (preview)](../../database/job-automation-overview.md).
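Elastic database jobs are registered through stored procedures in the job database rather than through SQL Server Agent. The following sketch only assembles the registration T-SQL for illustration; the `jobs.sp_add_job` and `jobs.sp_add_jobstep` procedure and parameter names follow the elastic jobs (preview) documentation and should be verified against the current preview, and the job, target group, and maintenance command are hypothetical:

```python
def elastic_job_script(job_name: str, target_group: str, command: str) -> str:
    """Assemble T-SQL that registers a job and one step with elastic jobs.

    This script runs against the *job database*, not the target databases.
    """
    # Escape single quotes so the command embeds safely in a T-SQL literal.
    cmd = command.replace("'", "''")
    return "\n".join([
        f"EXEC jobs.sp_add_job @job_name = '{job_name}';",
        f"EXEC jobs.sp_add_jobstep @job_name = '{job_name}',",
        f"    @command = N'{cmd}',",
        f"    @target_group_name = '{target_group}';",
    ])


script = elastic_job_script(
    "NightlyIndexMaintenance", "AllSalesDbs",
    "EXEC dbo.IndexOptimize @Databases = 'CURRENT';")
```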
-#### Logins and groups
-Move SQL logins from the source SQL Server to Azure SQL Database using Database Migration Service (DMS) in offline mode. Use the **Selected logins** blade in the **Migration Wizard** to migrate logins to your target SQL Database.
+### Logins and groups
+Move SQL logins from the SQL Server source to Azure SQL Database by using Database Migration Service in offline mode. Use the **Selected logins** pane in the Migration Wizard to migrate logins to your target SQL database.
-Windows users and groups can also be migrated using DMS by enabling the corresponding toggle button in the DMS Configuration page.
+You can also migrate Windows users and groups via Database Migration Service by enabling the corresponding toggle on the Database Migration Service **Configuration** page.
-Alternatively, you can use the [PowerShell utility tool](https://github.com/microsoft/DataMigrationTeam/tree/master/IP%20and%20Scripts/MoveLogins) specially designed by the Microsoft Data Migration Architects. The utility uses PowerShell to create a Transact-SQL (T-SQL) script to recreate logins and select database users from the source to the target. The tool automatically maps Windows AD accounts to Azure AD accounts, and can do a UPN lookup for each login against the source Active Directory. The tool scripts custom server and database roles, as well as role membership, database role, and user permissions. Contained databases are not yet supported and only a subset of possible SQL Server permissions are scripted.
+Alternatively, you can use the [PowerShell utility](https://github.com/microsoft/DataMigrationTeam/tree/master/IP%20and%20Scripts/MoveLogins) specially designed by Microsoft data migration architects. The utility uses PowerShell to create a Transact-SQL (T-SQL) script to re-create logins and select database users from the source to the target.
+The PowerShell utility automatically maps Windows Server Active Directory accounts to Azure Active Directory (Azure AD) accounts, and it can do a UPN lookup for each login against the source Active Directory instance. The utility scripts custom server and database roles, along with role membership and user permissions. Contained databases are not yet supported, and only a subset of possible SQL Server permissions are scripted.
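As a toy illustration of the account-mapping idea (the real utility also scripts roles and permissions, and resolves accounts against Active Directory rather than renaming them), the generated script boils down to `CREATE USER ... FROM EXTERNAL PROVIDER` statements, which is the T-SQL form for contained Azure AD users in Azure SQL Database. The domain and account names here are hypothetical:

```python
def map_windows_to_aad(windows_accounts, tenant_domain):
    """Map DOMAIN\\user Windows accounts to Azure AD UPNs (user@tenant).

    A real migration would look each account up in Azure AD; this plain rename
    only works when the sAMAccountName matches the UPN prefix.
    """
    upns = []
    for account in windows_accounts:
        _, _, sam = account.rpartition("\\")
        upns.append(f"{sam}@{tenant_domain}")
    return upns


def create_user_statements(upns):
    """Emit T-SQL that creates contained Azure AD users in the target database."""
    return "\n".join(
        f"CREATE USER [{upn}] FROM EXTERNAL PROVIDER;" for upn in upns)


upns = map_windows_to_aad(["CONTOSO\\jsmith", "CONTOSO\\adalton"], "contoso.com")
tsql = create_user_statements(upns)
```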
-#### System databases
+### System databases
For Azure SQL Database, the only applicable system databases are [master](/sql/relational-databases/databases/master-database) and tempdb. To learn more, see [Tempdb in Azure SQL Database](/sql/relational-databases/databases/tempdb-database#tempdb-database-in-sql-database).
-## Leverage advanced features
+## Advanced features
-Be sure to take advantage of the advanced cloud-based features offered by SQL Database. For example, you no longer need to worry about managing backups as the service does it for you. You can restore to any [point in time within the retention period](../../database/recovery-using-backups.md#point-in-time-restore).
+Be sure to take advantage of the advanced cloud-based features in SQL Database. For example, you don't need to worry about managing backups because the service does it for you. You can restore to any [point in time within the retention period](../../database/recovery-using-backups.md#point-in-time-restore).
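Point-in-time restore can also be driven from the Azure CLI. As a sketch, the following assembles an `az sql db restore` invocation as an argument list; the resource names are placeholders, and you should check `az sql db restore --help` for the authoritative parameter list:

```python
from datetime import datetime, timedelta, timezone


def restore_command(resource_group, server, database, dest_name, point_in_time):
    """Assemble an `az sql db restore` command as an argument list.

    The restore point must fall inside the database's backup retention period.
    """
    return [
        "az", "sql", "db", "restore",
        "--resource-group", resource_group,
        "--server", server,
        "--name", database,
        "--dest-name", dest_name,
        "--time", point_in_time.strftime("%Y-%m-%dT%H:%M:%SZ"),
    ]


# Restore SalesDb as it was 30 minutes ago into a new database.
point = datetime.now(timezone.utc) - timedelta(minutes=30)
cmd = restore_command("rg-migration", "contoso-srv", "SalesDb",
                      "SalesDb-restored", point)
```

Once the CLI is signed in, the list could be handed to `subprocess.run(cmd, check=True)`.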
-To strengthen security, consider using [Azure Active Directory Authentication](../../database/authentication-aad-overview.md), [auditing](../../database/auditing-overview.md), [threat detection](../../database/azure-defender-for-sql.md), [row-level security](/sql/relational-databases/security/row-level-security), and [dynamic data masking](/sql/relational-databases/security/dynamic-data-masking).
+To strengthen security, consider using [Azure AD authentication](../../database/authentication-aad-overview.md), [auditing](../../database/auditing-overview.md), [threat detection](../../database/azure-defender-for-sql.md), [row-level security](/sql/relational-databases/security/row-level-security), and [dynamic data masking](/sql/relational-databases/security/dynamic-data-masking).
-In addition to advanced management and security features, SQL Database provides a set of advanced tools that can help you [monitor and tune your workload](../../database/monitor-tune-overview.md). [Azure SQL Analytics (Preview)](../../../azure-monitor/insights/azure-sql.md) is an advanced cloud monitoring solution for monitoring performance of all of your databases in Azure SQL Database at scale and across multiple subscriptions in a single view. Azure SQL Analytics collects and visualizes key performance metrics with built-in intelligence for performance troubleshooting.
+In addition to advanced management and security features, SQL Database provides tools that can help you [monitor and tune your workload](../../database/monitor-tune-overview.md). [Azure SQL Analytics (Preview)](../../../azure-monitor/insights/azure-sql.md) is an advanced solution for monitoring the performance of all of your databases in Azure SQL Database at scale and across multiple subscriptions in a single view. Azure SQL Analytics collects and visualizes key performance metrics with built-in intelligence for performance troubleshooting.
-[Automatic tuning](/sql/relational-databases/automatic-tuning/automatic-tuning#automatic-plan-correction) continuously monitors performance of your SQL execution plan statistics and automatically fixes identified performance issues.
+[Automatic tuning](/sql/relational-databases/automatic-tuning/automatic-tuning#automatic-plan-correction) continuously monitors performance of your SQL execution plan and automatically fixes identified performance issues.
## Migration assets
-For additional assistance, see the following resources that were developed for real world migration projects.
+For more assistance, see the following resources that were developed for real-world migration projects.
|Asset |Description |
|||
-|[Data workload assessment model and tool](https://github.com/Microsoft/DataMigrationTeam/tree/master/Data%20Workload%20Assessment%20Model%20and%20Tool)| This tool provides suggested "best fit" target platforms, cloud readiness, and application/database remediation level for a given workload. It offers simple, one-click calculation and report generation that helps to accelerate large estate assessments by providing an automated and uniform target platform decision process.|
-|[DBLoader Utility](https://github.com/microsoft/DataMigrationTeam/tree/master/DBLoader%20Utility)|The DBLoader can be used to load data from delimited text files into SQL Server. This Windows console utility uses the SQL Server native client bulkload interface, which works on all versions of SQL Server, including Azure SQL Database.|
-|[Bulk database creation with PowerShell](https://github.com/Microsoft/DataMigrationTeam/tree/master/Bulk%20Database%20Creation%20with%20PowerShell)|This includes a set of three PowerShell scripts that create a resource group (create_rg.ps1), the [logical server in Azure](../../database/logical-servers.md) (create_sqlserver.ps1), and Azure SQL Database (create_sqldb.ps1). The scripts include loop capabilities so you can iterate and create as many servers and databases as necessary.|
-|[Bulk schema deployment with MSSQL-Scripter & PowerShell](https://github.com/Microsoft/DataMigrationTeam/tree/master/Bulk%20Schema%20Deployment%20with%20MSSQL-Scripter%20&%20PowerShell)|This asset creates a resource group, one or multiple [logical servers in Azure](../../database/logical-servers.md) to host Azure SQL Database, exports every schema from an on-premises SQL Server (or multiple SQL Servers (2005+) and imports it to Azure SQL Database.|
-|[Convert SQL Server Agent jobs into Elastic Database Jobs](https://github.com/microsoft/DataMigrationTeam/tree/master/IP%20and%20Scripts/Convert%20SQL%20Server%20Agent%20Jobs%20into%20Elastic%20Database%20Jobs)|This script migrates your source SQL Server Agent jobs to Elastic Database Jobs.|
-|[Send mails from Azure SQL Database](https://github.com/microsoft/DataMigrationTeam/tree/master/IP%20and%20Scripts/AF%20SendMail)|This provides a solution as an alternative to SendMail capability that is available in on-premises SQL Server. The solution uses Azure Functions and the Azure SendGrid service to send emails from Azure SQL Database.|
-|[Utility to move on-premises SQL Server logins to Azure SQL Database](https://github.com/microsoft/DataMigrationTeam/tree/master/IP%20and%20Scripts/MoveLogins)|A PowerShell script that creates a T-SQL command script to re-create logins and select database users from on-premises SQL Server to Azure SQL Database. The tool allows automatic mapping of Windows AD accounts to Azure AD accounts as well as optionally migrating SQL Server native logins.|
|[PerfMon data collection automation using Logman](https://github.com/microsoft/DataMigrationTeam/tree/master/IP%20and%20Scripts/Perfmon%20Data%20Collection%20Automation%20Using%20Logman)|A tool that collects PerfMon data to understand baseline performance and assists in migration target recommendations. This tool uses logman.exe to create the command that will create, start, stop, and delete performance counters set on a remote SQL Server.|
-|[Whitepaper - Database migration to Azure SQL DB using BACPAC](https://github.com/microsoft/DataMigrationTeam/blob/master/Whitepapers/Database%20migrations%20-%20Benchmarks%20and%20Steps%20to%20Import%20to%20Azure%20SQL%20DB%20Single%20Database%20from%20BACPAC.pdf)|This whitepaper provides guidance and steps to help accelerate migrations from SQL Server to Azure SQL Database using BACPAC files.|
+|[Data workload assessment model and tool](https://github.com/Microsoft/DataMigrationTeam/tree/master/Data%20Workload%20Assessment%20Model%20and%20Tool)| This tool provides suggested "best fit" target platforms, cloud readiness, and an application/database remediation level for a workload. It offers simple, one-click calculation and report generation that helps to accelerate large estate assessments by providing an automated and uniform decision process for target platforms.|
+|[DBLoader utility](https://github.com/microsoft/DataMigrationTeam/tree/master/DBLoader%20Utility)|You can use DBLoader to load data from delimited text files into SQL Server. This Windows console utility uses the SQL Server native client bulk-load interface. The interface works on all versions of SQL Server, along with Azure SQL Database.|
+|[Bulk database creation with PowerShell](https://github.com/Microsoft/DataMigrationTeam/tree/master/Bulk%20Database%20Creation%20with%20PowerShell)|You can use a set of three PowerShell scripts that create a resource group (create_rg.ps1), the [logical server in Azure](../../database/logical-servers.md) (create_sqlserver.ps1), and a SQL database (create_sqldb.ps1). The scripts include loop capabilities so you can iterate and create as many servers and databases as necessary.|
+|[Bulk schema deployment with MSSQL-Scripter and PowerShell](https://github.com/Microsoft/DataMigrationTeam/tree/master/Bulk%20Schema%20Deployment%20with%20MSSQL-Scripter%20&%20PowerShell)|This asset creates a resource group, creates one or multiple [logical servers in Azure](../../database/logical-servers.md) to host Azure SQL Database, exports every schema from an on-premises SQL Server instance (or multiple SQL Server 2005+ instances), and imports the schemas to Azure SQL Database.|
+|[Convert SQL Server Agent jobs into elastic database jobs](https://github.com/microsoft/DataMigrationTeam/tree/master/IP%20and%20Scripts/Convert%20SQL%20Server%20Agent%20Jobs%20into%20Elastic%20Database%20Jobs)|This script migrates your source SQL Server Agent jobs to elastic database jobs.|
+|[Send emails from Azure SQL Database](https://github.com/microsoft/DataMigrationTeam/tree/master/IP%20and%20Scripts/AF%20SendMail)|This solution is an alternative to SendMail capability and is available for on-premises SQL Server. It uses Azure Functions and the SendGrid service to send emails from Azure SQL Database.|
+|[Utility to move on-premises SQL Server logins to Azure SQL Database](https://github.com/microsoft/DataMigrationTeam/tree/master/IP%20and%20Scripts/MoveLogins)|A PowerShell script can create a T-SQL command script to re-create logins and select database users from on-premises SQL Server to Azure SQL Database. The tool allows automatic mapping of Windows Server Active Directory accounts to Azure AD accounts, along with optionally migrating SQL Server native logins.|
+|[Perfmon data collection automation by using Logman](https://github.com/microsoft/DataMigrationTeam/tree/master/IP%20and%20Scripts/Perfmon%20Data%20Collection%20Automation%20Using%20Logman)|You can use the Logman tool to collect Perfmon data (to help you understand baseline performance) and get migration target recommendations. This tool uses logman.exe to create the command that will create, start, stop, and delete performance counters set on a remote SQL Server instance.|
+|[Database migration to Azure SQL Database by using BACPAC](https://github.com/microsoft/DataMigrationTeam/blob/master/Whitepapers/Database%20migrations%20-%20Benchmarks%20and%20Steps%20to%20Import%20to%20Azure%20SQL%20DB%20Single%20Database%20from%20BACPAC.pdf)|This white paper provides guidance and steps to help accelerate migrations from SQL Server to Azure SQL Database by using BACPAC files.|
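The bulk database creation asset listed above wraps three scripts (create_rg.ps1, create_sqlserver.ps1, create_sqldb.ps1) around a loop. The following is only a minimal sketch of that loop pattern using the Az PowerShell module; all resource names are hypothetical placeholders, and it assumes an authenticated session:

```powershell
# Minimal sketch of the create_rg.ps1 / create_sqlserver.ps1 / create_sqldb.ps1 loop pattern.
# Assumes the Az.Sql module and a session signed in with Connect-AzAccount.
# All names below are hypothetical placeholders.
$rg = "migration-rg"
New-AzResourceGroup -Name $rg -Location "westus2"

1..3 | ForEach-Object {
    $server = "migration-sqlsrv-$_"
    # Create the logical server, then a database on it
    New-AzSqlServer -ResourceGroupName $rg -ServerName $server -Location "westus2" `
        -SqlAdministratorCredentials (Get-Credential)
    New-AzSqlDatabase -ResourceGroupName $rg -ServerName $server -DatabaseName "AppDb"
}
```

Adjust the range (`1..3`) and naming scheme to iterate over as many servers and databases as the migration requires.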
The Data SQL Engineering team developed these resources. This team's core charter is to unblock and accelerate complex modernization for data platform migration projects to Microsoft's Azure data platform.
## Next steps
-To start migrating your SQL Server to SQL Database, see the [SQL Server to Azure SQL Database migration guide](sql-server-to-sql-database-guide.md).
+- To start migrating your SQL Server databases to Azure SQL Database, see the [SQL Server to Azure SQL Database migration guide](sql-server-to-sql-database-guide.md).
-- For a matrix of the Microsoft and third-party services and tools that are available to assist you with various database and data migration scenarios as well as specialty tasks, see [Service and tools for data migration](../../../dms/dms-tools-matrix.md).
+- For a matrix of services and tools that can help you with database and data migration scenarios as well as specialty tasks, see [Services and tools for data migration](../../../dms/dms-tools-matrix.md).
-- To learn more about SQL Database see:
+- To learn more about SQL Database, see:
- [Overview of Azure SQL Database](../../database/sql-database-paas-overview.md)
- - [Azure total Cost of Ownership Calculator](https://azure.microsoft.com/pricing/tco/calculator/)
+ - [Azure Total Cost of Ownership Calculator](https://azure.microsoft.com/pricing/tco/calculator/)
-- To learn more about the framework and adoption cycle for Cloud migrations, see
+- To learn more about the framework and adoption cycle for cloud migrations, see:
- [Cloud Adoption Framework for Azure](/azure/cloud-adoption-framework/migrate/azure-best-practices/contoso-migration-scale)
- - [Best practices for costing and sizing workloads migrate to Azure](/azure/cloud-adoption-framework/migrate/azure-best-practices/migrate-best-practices-costs)
+ - [Best practices for costing and sizing workloads migrated to Azure](/azure/cloud-adoption-framework/migrate/azure-best-practices/migrate-best-practices-costs)
+- To assess the application access layer, see [Data Access Migration Toolkit (Preview)](https://marketplace.visualstudio.com/items?itemName=ms-databasemigration.data-access-migration-toolkit).
-- To assess the Application access layer, see [Data Access Migration Toolkit (Preview)](https://marketplace.visualstudio.com/items?itemName=ms-databasemigration.data-access-migration-toolkit)
-- For details on how to perform Data Access Layer A/B testing see [Database Experimentation Assistant](/sql/dea/database-experimentation-assistant-overview).
+- For details on how to perform A/B testing for the data access layer, see [Database Experimentation Assistant](/sql/dea/database-experimentation-assistant-overview).
azure-sql Sql Server To Managed Instance Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/migration-guides/managed-instance/sql-server-to-managed-instance-overview.md
- Title: "SQL Server to Azure SQL Managed Instance: Migration overview"
-description: Learn about the different tools and options available to migrate your SQL Server databases to Azure SQL Managed Instance.
+ Title: "SQL Server to SQL Managed Instance: Migration overview"
+description: Learn about the tools and options available to migrate your SQL Server databases to Azure SQL Managed Instance.
Last updated 02/18/2020
# Migration overview: SQL Server to Azure SQL Managed Instance
[!INCLUDE[appliesto--sqlmi](../../includes/appliesto-sqlmi.md)]
-Learn about different migration options and considerations to migrate your SQL Server to Azure SQL Managed Instance.
+Learn about the options and considerations for migrating your SQL Server databases to Azure SQL Managed Instance.
-You can migrate SQL Server running on-premises or on:
+You can migrate SQL Server databases running on-premises or on:
-- SQL Server on Virtual Machines
-- Amazon Web Services (AWS) EC2
-- Amazon Relational Database Service (AWS RDS)
-- Compute Engine (Google Cloud Platform - GCP)
-- Cloud SQL for SQL Server (Google Cloud Platform – GCP)
+- SQL Server on Azure Virtual Machines.
+- Amazon Web Services (AWS) Elastic Compute Cloud (EC2).
+- AWS Relational Database Service (RDS).
+- Compute Engine in Google Cloud Platform (GCP).
+- Cloud SQL for SQL Server in GCP.
For other migration guides, see [Database Migration](https://docs.microsoft.com/data-migration).
## Overview
-[Azure SQL Managed Instance](../../managed-instance/sql-managed-instance-paas-overview.md) is a recommended target option for SQL Server workloads that require a fully managed service without having to manage virtual machines or their operating systems. SQL Managed Instance enables you to lift-and-shift your on-premises applications to Azure with minimal application or database changes while having complete isolation of your instances with native virtual network (VNet) support.
+[Azure SQL Managed Instance](../../managed-instance/sql-managed-instance-paas-overview.md) is a recommended target option for SQL Server workloads that require a fully managed service without having to manage virtual machines or their operating systems. SQL Managed Instance enables you to move your on-premises applications to Azure with minimal application or database changes. It offers complete isolation of your instances with native virtual network support.
## Considerations
-The key factors to consider when evaluating migration options depend on:
+The key factors to consider when you're evaluating migration options are:
- Number of servers and databases
- Size of databases
- Acceptable business downtime during the migration process
-One of the key benefits of migrating your SQL Servers to SQL Managed Instance is that you can choose to migrate the entire instance, or just a subset of individual databases. Carefully plan to include the following in your migration process:
+One of the key benefits of migrating your SQL Server databases to SQL Managed Instance is that you can choose to migrate the entire instance or just a subset of individual databases. Carefully plan to include the following in your migration process:
- All databases that need to be colocated to the same instance
-- Instance-level objects required for your application, including logins, credentials, SQL Agent jobs and operators, and server-level triggers.
+- Instance-level objects required for your application, including logins, credentials, SQL Agent jobs and operators, and server-level triggers
> [!NOTE]
-> Azure SQL Managed Instance guarantees 99.99% availability even in critical scenarios, so overhead caused by some features in SQL MI cannot be disabled. For more information, see the [root causes that might cause different performance on SQL Server and Azure SQL Managed Instance](https://azure.microsoft.com/blog/key-causes-of-performance-differences-between-sql-managed-instance-and-sql-server/) blog.
+> Azure SQL Managed Instance guarantees 99.99 percent availability, even in critical scenarios. Overhead caused by some features in SQL Managed Instance can't be disabled. For more information, see the [Key causes of performance differences between SQL Managed Instance and SQL Server](https://azure.microsoft.com/blog/key-causes-of-performance-differences-between-sql-managed-instance-and-sql-server/) blog entry.
-## Choose appropriate target
+## Choose an appropriate target
-Some general guidelines to help you choose the right service tier and characteristics of SQL Managed Instance to help match your [performance baseline](sql-server-to-managed-instance-performance-baseline.md):
+The following general guidelines can help you choose the right service tier and characteristics of SQL Managed Instance to help match your [performance baseline](sql-server-to-managed-instance-performance-baseline.md):
-- Use the CPU usage baseline to provision a managed instance that matches the number of cores your instance of SQL Server uses. It may be necessary to scale resources to match the [hardware generation characteristics](../../managed-instance/resource-limits.md#hardware-generation-characteristics).
+- Use the CPU usage baseline to provision a managed instance that matches the number of cores that your instance of SQL Server uses. It might be necessary to scale resources to match the [hardware generation characteristics](../../managed-instance/resource-limits.md#hardware-generation-characteristics).
- Use the memory usage baseline to choose a [vCore option](../../managed-instance/resource-limits.md#service-tier-characteristics) that appropriately matches your memory allocation.
-- Use the baseline IO latency of the file subsystem to choose between General Purpose (latency greater than 5 ms) and Business Critical (latency less than 3 ms) service tiers.
-- Use the baseline throughput to preallocate the size of the data and log files to achieve expected IO performance.
+- Use the baseline I/O latency of the file subsystem to choose between the General Purpose (latency greater than 5 ms) and Business Critical (latency less than 3 ms) service tiers.
+- Use the baseline throughput to preallocate the size of the data and log files to achieve expected I/O performance.
-You can choose compute and storage resources during deployment and then [change them after using the Azure portal](../../database/scale-resources.md) without incurring downtime for your application.
+You can choose compute and storage resources during deployment and then [change them afterward by using the Azure portal](../../database/scale-resources.md), without incurring downtime for your application.
> [!IMPORTANT]
-> Any discrepancy in the [managed instance virtual network requirements](../../managed-instance/connectivity-architecture-overview.md#network-requirements) can prevent you from creating new instances or using existing ones. Learn more about [creating new](../../managed-instance/virtual-network-subnet-create-arm-template.md) and [configuring existing](../../managed-instance/vnet-existing-add-subnet.md) networks.
+> Any discrepancy in the [virtual network requirements for managed instances](../../managed-instance/connectivity-architecture-overview.md#network-requirements) can prevent you from creating new instances or using existing ones. Learn more about [creating new](../../managed-instance/virtual-network-subnet-create-arm-template.md) and [configuring existing](../../managed-instance/vnet-existing-add-subnet.md) networks.
-Another key consideration in the selection of the target service tier in Azure SQL Managed Instance (General Purpose Vs Business Critical) is the availability of certain features like In-Memory OLTP that is only available in Business Critical tier.
+Another key consideration in the selection of the target service tier in Azure SQL Managed Instance (General Purpose versus Business Critical) is the availability of certain features, like In-Memory OLTP, that are available only in the Business Critical tier.
### SQL Server VM alternative
-Your business may have requirements that make SQL Server on Azure VMs a more suitable target than Azure SQL Managed Instance.
+Your business might have requirements that make [SQL Server on Azure Virtual Machines](../../virtual-machines/windows/sql-server-on-azure-vm-iaas-what-is-overview.md) a more suitable target than Azure SQL Managed Instance.
-If the following apply to your business, consider moving to a SQL Server VM instead:
-
-- If you require direct access to the operating system or file system, such as to install third-party or custom agents on the same virtual machine with SQL Server.
-- If you have strict dependency on features that are still not supported, such as FileStream/FileTable, PolyBase, and cross-instance transactions.
-- If you absolutely need to stay at a specific version of SQL Server (2012, for instance).
-- If your compute requirements are much lower than managed instance offers (one vCore, for instance), and database consolidation is not an acceptable option.
+If one of the following conditions applies to your business, consider moving to a SQL Server virtual machine (VM) instead:
+- You require direct access to the operating system or file system, such as to install third-party or custom agents on the same virtual machine with SQL Server.
+- You have strict dependency on features that are still not supported, such as FileStream/FileTable, PolyBase, and cross-instance transactions.
+- You need to stay at a specific version of SQL Server (2012, for example).
+- Your compute requirements are much lower than a managed instance offers (one vCore, for example), and database consolidation is not an acceptable option.
## Migration tools
-
-The recommended tools for migration are the Data Migration Assistant and the Azure Database Migration Service. There are other alternative migration options available as well.
-
-### Recommended tools
-
-The following table lists the recommended migration tools:
+We recommend the following migration tools:
|Technology | Description|
|||
-| [Azure Migrate](../../../migrate/how-to-create-azure-sql-assessment.md) | Azure Migrate for Azure SQL allows you to discover and assess your SQL data estate at scale when on VMware, providing Azure SQL deployment recommendations, target sizing, and monthly estimates. |
-|[Azure Database Migration Service (DMS)](../../../dms/tutorial-sql-server-to-managed-instance.md) | First party Azure service that supports migration in the offline mode for applications that can afford downtime during the migration process. Unlike the continuous migration in online mode, offline mode migration runs a one-time restore of a full database backup from the source to the target. |
-|[Native backup and restore](../../managed-instance/restore-sample-database-quickstart.md) | SQL Managed Instance supports RESTORE of native SQL Server database backups (.bak files), making it the easiest migration option for customers who can provide full database backups to Azure storage. Full and differential backups are also supported and documented in the [migration assets section](#migration-assets) later in this article.|
-|[Log Replay Service (LRS)](../../managed-instance/log-replay-service-migrate.md) | This is a cloud service enabled for Managed Instance based on the SQL Server log shipping technology, making it a migration option for customers who can provide full, differential, and log database backups to Azure storage. LRS is used to restore backup files from Azure Blob Storage to SQL Managed Instance.|
+| [Azure Migrate](../../../migrate/how-to-create-azure-sql-assessment.md) | This Azure service helps you discover and assess your SQL data estate at scale on VMware. It provides Azure SQL deployment recommendations, target sizing, and monthly estimates. |
+|[Azure Database Migration Service](../../../dms/tutorial-sql-server-to-managed-instance.md) | This Azure service supports migration in the offline mode for applications that can afford downtime during the migration process. Unlike the continuous migration in online mode, offline mode migration runs a one-time restore of a full database backup from the source to the target. |
+|[Native backup and restore](../../managed-instance/restore-sample-database-quickstart.md) | SQL Managed Instance supports restore of native SQL Server database backups (.bak files). It's the easiest migration option for customers who can provide full database backups to Azure Storage. Full and differential backups are also supported and documented in the [section about migration assets](#migration-assets) later in this article.|
+|[Log Replay Service](../../managed-instance/log-replay-service-migrate.md) | This cloud service is enabled for SQL Managed Instance based on SQL Server log-shipping technology. It's a migration option for customers who can provide full, differential, and log database backups to Azure Storage. Log Replay Service is used to restore backup files from Azure Blob Storage to SQL Managed Instance.|
| | |
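The native backup and restore option in the table above amounts to a `RESTORE ... FROM URL` against Azure Blob Storage. A minimal T-SQL sketch, where the storage account, container, and SAS token are placeholders:

```sql
-- The credential name must match the container URL; the SAS token is a placeholder.
CREATE CREDENTIAL [https://<storageaccount>.blob.core.windows.net/<container>]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<SAS-token-without-leading-question-mark>';

-- Restore the native .bak file directly from Azure Blob Storage.
RESTORE DATABASE [AdventureWorks]
FROM URL = 'https://<storageaccount>.blob.core.windows.net/<container>/AdventureWorks.bak';
```

On SQL Managed Instance the restore runs asynchronously; progress can be tracked through `sys.dm_operation_status` or the restore session's messages.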
-### Alternative tools
-
The following table lists alternative migration tools:
|**Technology** |**Description** |
|||
-|[Transactional replication](../../managed-instance/replication-transactional-overview.md) | Replicate data from source SQL Server database table(s) to SQL Managed Instance by providing a publisher-subscriber type migration option while maintaining transactional consistency. |
-|[Bulk copy](/sql/relational-databases/import-export/import-and-export-bulk-data-by-using-the-bcp-utility-sql-server)| The [bulk copy program (bcp) utility](/sql/tools/bcp-utility) copies data from an instance of SQL Server into a data file. Use the BCP utility to export the data from your source and import the data file into the target SQL Managed Instance.</br></br> For high-speed bulk copy operations to move data to Azure SQL Database, [Smart Bulk Copy tool](/samples/azure-samples/smartbulkcopy/smart-bulk-copy/) can be used to maximize transfer speeds by leveraging parallel copy tasks. |
-|[Import Export Wizard / BACPAC](../../database/database-import.md?tabs=azure-powershell)| [BACPAC](/sql/relational-databases/data-tier-applications/data-tier-applications#bacpac) is a Windows file with a `.bacpac` extension that encapsulates a database's schema and data. BACPAC can be used to both export data from a source SQL Server and to import the file back into Azure SQL Managed Instance. |
-|[Azure Data Factory (ADF)](../../../data-factory/connector-azure-sql-managed-instance.md)| The [Copy activity](../../../data-factory/copy-activity-overview.md) in Azure Data Factory migrates data from source SQL Server database(s) to SQL Managed Instance using built-in connectors and an [Integration Runtime](../../../data-factory/concepts-integration-runtime.md).</br> </br> ADF supports a wide range of [connectors](../../../data-factory/connector-overview.md) to move data from SQL Server sources to SQL Managed Instance. |
-| | |
+|[Transactional replication](../../managed-instance/replication-transactional-overview.md) | Replicate data from source SQL Server database tables to SQL Managed Instance by providing a publisher-subscriber type migration option while maintaining transactional consistency. |
+|[Bulk copy](/sql/relational-databases/import-export/import-and-export-bulk-data-by-using-the-bcp-utility-sql-server)| The [bulk copy program (bcp) tool](/sql/tools/bcp-utility) copies data from an instance of SQL Server into a data file. Use the tool to export the data from your source and import the data file into the target SQL managed instance. </br></br> For high-speed bulk copy operations to move data to Azure SQL Managed Instance, you can use the [Smart Bulk Copy tool](/samples/azure-samples/smartbulkcopy/smart-bulk-copy/) to maximize transfer speed by taking advantage of parallel copy tasks. |
+|[Import Export Wizard/BACPAC](../../database/database-import.md?tabs=azure-powershell)| [BACPAC](/sql/relational-databases/data-tier-applications/data-tier-applications#bacpac) is a Windows file with a .bacpac extension that encapsulates a database's schema and data. You can use BACPAC to both export data from a SQL Server source and import the data back into Azure SQL Managed Instance. |
+|[Azure Data Factory](../../../data-factory/connector-azure-sql-managed-instance.md)| The [Copy activity](../../../data-factory/copy-activity-overview.md) in Azure Data Factory migrates data from source SQL Server databases to SQL Managed Instance by using built-in connectors and an [integration runtime](../../../data-factory/concepts-integration-runtime.md).</br> </br> Data Factory supports a wide range of [connectors](../../../data-factory/connector-overview.md) to move data from SQL Server sources to SQL Managed Instance. |
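The bulk copy flow in the table above exports each table to a data file and imports it on the managed instance. A sketch of the two bcp calls, with placeholder server names and credentials:

```console
rem Export from the source instance (-T = Windows authentication, -n = native format)
bcp SalesDb.dbo.Orders out Orders.dat -S onprem-sql01 -T -n

rem Import into the target managed instance over SQL authentication
bcp SalesDb.dbo.Orders in Orders.dat -S <mi-name>.<dns-zone>.database.windows.net -U sqladmin -P <password> -n
```

Native format (`-n`) avoids data-type conversion issues between identical schemas; use `-c` with explicit format files when the schemas differ.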
## Compare migration options
-Compare migration options to choose the path appropriate to your business needs.
-
-### Recommended options
+Compare migration options to choose the path that's appropriate to your business needs.
-The following table compares the recommended migration options:
+The following table compares the migration options that we recommend:
|Migration option |When to use |Considerations |
||||
-|[Azure Database Migration Service (DMS)](../../../dms/tutorial-sql-server-to-managed-instance.md) | - Migrate single databases or multiple databases at scale. </br> - Can accommodate downtime during migration process. </br> </br> Supported sources: </br> - SQL Server (2005 - 2019) on-premises or Azure VM </br> - AWS EC2 </br> - AWS RDS </br> - GCP Compute SQL Server VM | - Migrations at scale can be automated via [PowerShell](../../../dms/howto-sql-server-to-azure-sql-managed-instance-powershell-offline.md). </br> - Time to complete migration is dependent on database size and impacted by backup and restore time. </br> - Sufficient downtime may be required. |
-|[Native backup and restore](../../managed-instance/restore-sample-database-quickstart.md) | - Migrate individual line-of-business application database(s). </br> - Quick and easy migration without a separate migration service or tool. </br> </br> Supported sources: </br> - SQL Server (2005 - 2019) on-premises or Azure VM </br> - AWS EC2 </br> - AWS RDS </br> - GCP Compute SQL Server VM | - Database backup uses multiple threads to optimize data transfer to Azure Blob storage but ISV bandwidth and database size can impact transfer rate. </br> - Downtime should accommodate the time required to perform a full backup and restore (which is a size of data operation).|
-|[Log Replay Service (LRS)](../../managed-instance/log-replay-service-migrate.md) | - Migrate individual line-of-business application database(s). </br> - More control is needed for database migrations. </br> </br> Supported sources: </br> - SQL Server (2008 - 2019) on-premises or Azure VM </br> - AWS EC2 </br> - AWS RDS </br> - GCP Compute SQL Server VM | - The migration entails making full database backups on SQL Server and copying backup files to Azure Blob Storage. LRS is used to restore backup files from Azure Blob Storage to SQL Managed Instance. </br> - Databases being restored during the migration process will be in a restoring mode and cannot be used to read or write until the process has been completed..|
+|[Azure Database Migration Service](../../../dms/tutorial-sql-server-to-managed-instance.md) | - Migrate single databases or multiple databases at scale. </br> - Can accommodate downtime during the migration process. </br> </br> Supported sources: </br> - SQL Server (2005 to 2019) on-premises or Azure VM </br> - AWS EC2 </br> - AWS RDS </br> - GCP Compute SQL Server VM | - Migrations at scale can be automated via [PowerShell](../../../dms/howto-sql-server-to-azure-sql-managed-instance-powershell-offline.md). </br> - Time to complete migration depends on database size and is affected by backup and restore time. </br> - Sufficient downtime might be required. |
+|[Native backup and restore](../../managed-instance/restore-sample-database-quickstart.md) | - Migrate individual line-of-business application databases. </br> - Quick and easy migration without a separate migration service or tool. </br> </br> Supported sources: </br> - SQL Server (2005 to 2019) on-premises or Azure VM </br> - AWS EC2 </br> - AWS RDS </br> - GCP Compute SQL Server VM | - Database backup uses multiple threads to optimize data transfer to Azure Blob Storage, but partner bandwidth and database size can affect transfer rate. </br> - Downtime should accommodate the time required to perform a full backup and restore (which is a size of data operation).|
+|[Log Replay Service](../../managed-instance/log-replay-service-migrate.md) | - Migrate individual line-of-business application databases. </br> - More control is needed for database migrations. </br> </br> Supported sources: </br> - SQL Server (2008 to 2019) on-premises or Azure VM </br> - AWS EC2 </br> - AWS RDS </br> - GCP Compute SQL Server VM | - The migration entails making full database backups on SQL Server and copying backup files to Azure Blob Storage. Log Replay Service is used to restore backup files from Azure Blob Storage to SQL Managed Instance. </br> - Databases being restored during the migration process will be in a restoring mode and can't be used to read or write until the process has finished.|
| | | |
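A Log Replay Service migration, as compared in the table above, is driven through the Azure CLI. A hedged sketch of the start call, with placeholder resource names and a placeholder SAS token:

```console
az sql midb log-replay start \
    --resource-group MigrationRG \
    --managed-instance MyManagedInstance \
    --name MyDatabase \
    --auto-complete --last-backup-name "full_backup.bak" \
    --storage-uri "https://<storageaccount>.blob.core.windows.net/<container>" \
    --storage-sas "<SAS-token>"
```

With `--auto-complete`, the service finishes the migration after restoring the named last backup; omit it to keep restoring newly uploaded log backups until you run `az sql midb log-replay complete`.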
-### Alternative options
-
The following table compares the alternative migration options:
-|Method / technology |When to use |Considerations |
+|Method or technology |When to use |Considerations |
||||
-|[Transactional replication](../../managed-instance/replication-transactional-overview.md) | - Migrate by continuously publishing changes from source database tables to target SQL Managed Instance database tables. </br> - Full or partial database migrations of selected tables (subset of database). </br> </br> Supported sources: </br> - SQL Server (2012 - 2019) with some limitations </br> - AWS EC2 </br> - GCP Compute SQL Server VM | </br> - Setup is relatively complex compared to other migration options. </br> - Provides a continuous replication option to migrate data (without taking the databases offline).</br> - Transactional replication has a number of limitations to consider when setting up the Publisher on the source SQL Server. See [Limitations on Publishing Objects](/sql/relational-databases/replication/publish/publish-data-and-database-objects#limitations-on-publishing-objects) to learn more. </br> - Capability to [monitor replication activity](/sql/relational-databases/replication/monitor/monitoring-replication) is available. |
-|[Bulk copy](/sql/relational-databases/import-export/import-and-export-bulk-data-by-using-the-bcp-utility-sql-server)| - Migrating full or partial data migrations. </br> - Can accommodate downtime. </br> </br> Supported sources: </br> - SQL Server (2005 - 2019) on-premises or Azure VM </br> - AWS EC2 </br> - AWS RDS </br> - GCP Compute SQL Server VM | - Requires downtime for exporting data from source and importing into target. </br> - The file formats and data types used in the export / import need to be consistent with table schemas. |
|[Transactional replication](../../managed-instance/replication-transactional-overview.md) | - Migrate by continuously publishing changes from source database tables to target SQL Managed Instance database tables. </br> - Do full or partial database migrations of selected tables (subset of a database). </br> </br> Supported sources: </br> - SQL Server (2012 to 2019) with some limitations </br> - AWS EC2 </br> - GCP Compute SQL Server VM | </br> - Setup is relatively complex compared to other migration options. </br> - Provides a continuous replication option to migrate data (without taking the databases offline).</br> - Transactional replication has limitations to consider when you're setting up the publisher on the source SQL Server instance. See [Limitations on publishing objects](/sql/relational-databases/replication/publish/publish-data-and-database-objects#limitations-on-publishing-objects) to learn more. </br> - Capability to [monitor replication activity](/sql/relational-databases/replication/monitor/monitoring-replication) is available. |
|[Bulk copy](/sql/relational-databases/import-export/import-and-export-bulk-data-by-using-the-bcp-utility-sql-server)| - Do full or partial data migrations. </br> - Can accommodate downtime. </br> </br> Supported sources: </br> - SQL Server (2005 to 2019) on-premises or Azure VM </br> - AWS EC2 </br> - AWS RDS </br> - GCP Compute SQL Server VM | - Requires downtime for exporting data from the source and importing into the target. </br> - The file formats and data types used in the export or import need to be consistent with table schemas. |
|[Import Export Wizard/BACPAC](../../database/database-import.md)| - Migrate individual line-of-business application databases. </br>- Suited for smaller databases. </br> Does not require a separate migration service or tool. </br> </br> Supported sources: </br> - SQL Server (2005 to 2019) on-premises or Azure VM </br> - AWS EC2 </br> - AWS RDS </br> - GCP Compute SQL Server VM | </br> - Requires downtime because data needs to be exported at the source and imported at the destination. </br> - The file formats and data types used in the export or import need to be consistent with table schemas to avoid truncation or data-type mismatch errors. </br> - Time taken to export a database with a large number of objects can be significantly higher. |
|[Azure Data Factory](../../../data-factory/connector-azure-sql-managed-instance.md)| - Migrate and/or transform data from source SQL Server databases.</br> - Merging data from multiple sources of data to Azure SQL Managed Instance is typically for business intelligence (BI) workloads. </br> - Requires creating data movement pipelines in Data Factory to move data from source to destination. </br> - [Cost](https://azure.microsoft.com/pricing/details/data-factory/data-pipeline/) is an important consideration and is based on factors like pipeline triggers, activity runs, and duration of data movement. |
| | | |

## Feature interoperability
There are more considerations when you're migrating workloads that rely on other SQL Server features.

### SQL Server Integration Services

Migrate SQL Server Integration Services (SSIS) packages and projects in SSISDB to Azure SQL Managed Instance by using [Azure Database Migration Service](../../../dms/how-to-migrate-ssis-packages-managed-instance.md).

Only SSIS packages in SSISDB starting with SQL Server 2012 are supported for migration. Convert older SSIS packages before migration. See the [project conversion tutorial](/sql/integration-services/lesson-6-2-converting-the-project-to-the-project-deployment-model) to learn more.
### SQL Server Reporting Services

You can migrate SQL Server Reporting Services (SSRS) reports to paginated reports in Power BI. Use the [RDL Migration Tool](https://github.com/microsoft/RdlMigration) to help prepare and migrate your reports. Microsoft developed this tool to help customers migrate Report Definition Language (RDL) reports from their SSRS servers to Power BI. It's available on GitHub, and it documents an end-to-end walkthrough of the migration scenario.
### SQL Server Analysis Services

SQL Server Analysis Services tabular models from SQL Server 2012 and later can be migrated to Azure Analysis Services, which is a platform as a service (PaaS) deployment model for the Analysis Services tabular model in Azure. You can learn more about migrating on-premises models to Azure Analysis Services in [this video tutorial](https://azure.microsoft.com/resources/videos/azure-analysis-services-moving-models/).

Alternatively, you can consider migrating your on-premises Analysis Services tabular models to [Power BI Premium by using the new XMLA read/write endpoints](/power-bi/admin/service-premium-connect-tools).
### High availability

The SQL Server high-availability features Always On failover cluster instances and Always On availability groups become obsolete on the target SQL managed instance. High-availability architecture is already built into both [General Purpose (standard availability model)](../../database/high-availability-sla.md#basic-standard-and-general-purpose-service-tier-locally-redundant-availability) and [Business Critical (premium availability model)](../../database/high-availability-sla.md#premium-and-business-critical-service-tier-locally-redundant-availability) service tiers for SQL Managed Instance. The premium availability model also provides read scale-out that allows connecting into one of the secondary nodes for read-only purposes.

Beyond the high-availability architecture that's included in SQL Managed Instance, the [auto-failover groups](../../database/auto-failover-group-overview.md) feature allows you to manage the replication and failover of databases in a managed instance to another region.
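As a minimal sketch of read scale-out (the instance host name, port, and database name here are placeholders), a client typically opts into a read-only secondary on the premium availability model by adding `ApplicationIntent=ReadOnly` to its connection string:

```text
Server=tcp:myinstance.public.abc123.database.windows.net,3342;
Database=MyDatabase;ApplicationIntent=ReadOnly;
```

Connections without the `ApplicationIntent=ReadOnly` setting continue to go to the primary read-write replica.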
### SQL Agent jobs

Use the offline Azure Database Migration Service option to migrate [SQL Agent jobs](../../../dms/howto-sql-server-to-azure-sql-managed-instance-powershell-offline.md). Otherwise, script the jobs in Transact-SQL (T-SQL) by using SQL Server Management Studio and then manually re-create them on the target SQL managed instance.
> [!IMPORTANT]
> Currently, Azure Database Migration Service supports only jobs with T-SQL subsystem steps. Jobs with SSIS package steps have to be manually migrated.
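For jobs that you re-create manually, the scripted T-SQL runs against the `msdb` system stored procedures on the target. A minimal sketch (the job, step, database, and procedure names are hypothetical):

```tsql
-- Hypothetical example of re-creating a simple T-SQL job on the
-- target managed instance. Replace the names with your own.
USE msdb;
GO
EXEC dbo.sp_add_job
    @job_name = N'NightlyIndexMaintenance';

EXEC dbo.sp_add_jobstep
    @job_name = N'NightlyIndexMaintenance',
    @step_name = N'RebuildIndexes',
    @subsystem = N'TSQL',                           -- only T-SQL steps are supported
    @command = N'EXEC dbo.usp_RebuildAllIndexes;',  -- placeholder procedure
    @database_name = N'MyDatabase';

EXEC dbo.sp_add_jobserver
    @job_name = N'NightlyIndexMaintenance';
GO
```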
### Logins and groups

Move SQL logins from the SQL Server source to Azure SQL Managed Instance by using Database Migration Service in offline mode. Use the [Select logins](../../../dms/tutorial-sql-server-to-managed-instance.md#select-logins) pane in the Migration Wizard to migrate logins to your target SQL managed instance.

By default, Azure Database Migration Service supports migrating only SQL logins. However, you can enable the migration of Windows logins by:
- Ensuring that the target SQL managed instance has Azure Active Directory (Azure AD) read access. A user who has the Global Administrator role can configure that access via the Azure portal.
- Configuring Azure Database Migration Service to enable Windows user or group login migrations. You set this up via the Azure portal, on the **Configuration** page. After you enable this setting, restart the service for the changes to take effect.

After you restart the service, Windows user or group logins appear in the list of logins available for migration. For any Windows user or group logins that you migrate, you're prompted to provide the associated domain name. Service user accounts (accounts with the domain name NT AUTHORITY) and virtual user accounts (accounts with the domain name NT SERVICE) are not supported. To learn more, see [How to migrate Windows users and groups in a SQL Server instance to Azure SQL Managed Instance using T-SQL](../../managed-instance/migrate-sql-server-users-to-instance-transact-sql-tsql-tutorial.md).
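As a sketch of what the re-created principals look like on the target (the login names here are placeholders), a migrated SQL login keeps its password-based form, while Windows user or group logins are re-created as Azure AD server principals:

```tsql
-- Hypothetical names; run on the target managed instance.
-- A SQL login is re-created with password authentication:
CREATE LOGIN MyAppLogin WITH PASSWORD = '<strong_password>';

-- Windows logins map to Azure AD logins on a managed instance:
CREATE LOGIN [user@contoso.com] FROM EXTERNAL PROVIDER;
CREATE LOGIN [MyAadGroup] FROM EXTERNAL PROVIDER;
```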
Alternatively, you can use the [PowerShell utility](https://github.com/microsoft/DataMigrationTeam/tree/master/IP%20and%20Scripts/MoveLogins) specially designed by Microsoft data migration architects. The utility uses PowerShell to create a T-SQL script to re-create logins and select database users from the source to the target.

The PowerShell utility automatically maps Windows Server Active Directory accounts to Azure AD accounts, and it can do a UPN lookup for each login against the source Active Directory instance. The utility scripts custom server and database roles, along with role membership and user permissions. Contained databases are not yet supported, and only a subset of possible SQL Server permissions are scripted.
### Encryption

When you're migrating databases protected by [Transparent Data Encryption](../../database/transparent-data-encryption-tde-overview.md) to a managed instance by using the native restore option, [migrate the corresponding certificate](../../managed-instance/tde-certificate-migrate.md) from the source SQL Server instance to the target SQL managed instance *before* database restore.
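Exporting the certificate on the source is a standard T-SQL step. A minimal sketch (the certificate name, file paths, and password are placeholders):

```tsql
-- On the source SQL Server instance: export the TDE protector
-- certificate and its private key. Names and paths are hypothetical.
USE master;
GO
BACKUP CERTIFICATE MyTdeCert
    TO FILE = 'C:\Temp\MyTdeCert.cer'
    WITH PRIVATE KEY (
        FILE = 'C:\Temp\MyTdeCert.pvk',
        ENCRYPTION BY PASSWORD = '<strong_password>'
    );
```

You then upload the exported certificate to the target managed instance by following the linked certificate-migration article before restoring the database.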
### System databases

Restore of system databases is not supported. To migrate instance-level objects (stored in the master and msdb databases), script them by using T-SQL and then re-create them on the target managed instance.
### In-Memory OLTP (memory-optimized tables)

SQL Server provides an In-Memory OLTP capability. It allows usage of memory-optimized tables, memory-optimized table types, and natively compiled SQL modules to run workloads that have high-throughput and low-latency requirements for transactional processing.
> [!IMPORTANT]
> In-Memory OLTP is supported only in the Business Critical tier in Azure SQL Managed Instance. It's not supported in the General Purpose tier.
If you have memory-optimized tables or memory-optimized table types in your on-premises SQL Server instance and you want to migrate to Azure SQL Managed Instance, you should either:

- Choose the Business Critical tier for your target SQL managed instance that supports In-Memory OLTP.
- If you want to migrate to the General Purpose tier in Azure SQL Managed Instance, remove memory-optimized tables, memory-optimized table types, and natively compiled SQL modules that interact with memory-optimized objects before migrating your databases. You can use the following T-SQL query to identify all objects that need to be removed before migration to the General Purpose tier:
    ```tsql
    SELECT * FROM sys.tables WHERE is_memory_optimized=1
    SELECT * FROM sys.table_types WHERE is_memory_optimized=1
    SELECT * FROM sys.sql_modules WHERE uses_native_compilation=1
    ```
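After you've identified the objects, removal is typically ordered so that natively compiled modules are dropped before the memory-optimized tables and table types they reference. A hypothetical sketch (object names are placeholders):

```tsql
-- Hypothetical object names; drop dependent natively compiled modules
-- first, then the memory-optimized tables and table types.
DROP PROCEDURE dbo.usp_InsertOrder_Native;
DROP TABLE dbo.OrderQueue;
DROP TYPE dbo.OrderTableType;
```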
To learn more about in-memory technologies, see [Optimize performance by using in-memory technologies in Azure SQL Database and Azure SQL Managed Instance](../../in-memory-oltp-overview.md).

## Advanced features
Be sure to take advantage of the advanced cloud-based features in SQL Managed Instance. For example, you don't need to worry about managing backups because the service does it for you. You can restore to any [point in time within the retention period](../../database/recovery-using-backups.md#point-in-time-restore). Additionally, you don't need to worry about setting up high availability, because [high availability is built in](../../database/high-availability-sla.md).

To strengthen security, consider using [Azure AD authentication](../../database/authentication-aad-overview.md), [auditing](../../managed-instance/auditing-configure.md), [threat detection](../../database/azure-defender-for-sql.md), [row-level security](/sql/relational-databases/security/row-level-security), and [dynamic data masking](/sql/relational-databases/security/dynamic-data-masking).

In addition to advanced management and security features, SQL Managed Instance provides advanced tools that can help you [monitor and tune your workload](../../database/monitor-tune-overview.md). [Azure SQL Analytics](../../../azure-monitor/insights/azure-sql.md) allows you to monitor a large set of managed instances in a centralized way. [Automatic tuning](/sql/relational-databases/automatic-tuning/automatic-tuning#automatic-plan-correction) in managed instances continuously monitors performance of your SQL plan execution and automatically fixes the identified performance problems.

Some features are available only after the [database compatibility level](/sql/relational-databases/databases/view-or-change-the-compatibility-level-of-a-database) is changed to the latest compatibility level (150).
## Migration assets
For more assistance, see the following resources that were developed for real-world migration projects.

|Asset |Description |
|---|---|
|[Data workload assessment model and tool](https://github.com/Microsoft/DataMigrationTeam/tree/master/Data%20Workload%20Assessment%20Model%20and%20Tool)| This tool provides suggested "best fit" target platforms, cloud readiness, and an application/database remediation level for a workload. It offers simple, one-click calculation and report generation that helps to accelerate large estate assessments by providing an automated and uniform decision process for target platforms.|
|[DBLoader utility](https://github.com/microsoft/DataMigrationTeam/tree/master/DBLoader%20Utility)|You can use DBLoader to load data from delimited text files into SQL Server. This Windows console utility uses the SQL Server native client bulk-load interface. The interface works on all versions of SQL Server, along with Azure SQL Managed Instance.|
|[Utility to move on-premises SQL Server logins to Azure SQL Managed Instance](https://github.com/microsoft/DataMigrationTeam/tree/master/IP%20and%20Scripts/MoveLogins)|A PowerShell script can create a T-SQL command script to re-create logins and select database users from on-premises SQL Server to Azure SQL Managed Instance. The tool allows automatic mapping of Windows Server Active Directory accounts to Azure AD accounts, along with optionally migrating SQL Server native logins.|
|[Perfmon data collection automation by using Logman](https://github.com/microsoft/DataMigrationTeam/tree/master/IP%20and%20Scripts/Perfmon%20Data%20Collection%20Automation%20Using%20Logman)|You can use the Logman tool to collect Perfmon data (to help you understand baseline performance) and get migration target recommendations. This tool uses logman.exe to create the command that will create, start, stop, and delete performance counters set on a remote SQL Server instance.|
|[Database migration to Azure SQL Managed Instance by restoring full and differential backups](https://github.com/microsoft/DataMigrationTeam/blob/master/Whitepapers/Database%20migrations%20to%20Azure%20SQL%20DB%20Managed%20Instance%20-%20%20Restore%20with%20Full%20and%20Differential%20backups.pdf)|This white paper provides guidance and steps to help accelerate migrations from SQL Server to Azure SQL Managed Instance if you have only full and differential backups (and no log backup capability).|
The Data SQL Engineering team developed these resources. This team's core charter is to unblock and accelerate complex modernization for data platform migration projects to Microsoft's Azure data platform.

## Next steps
- To start migrating your SQL Server databases to Azure SQL Managed Instance, see the [SQL Server to Azure SQL Managed Instance migration guide](sql-server-to-managed-instance-guide.md).
- For a matrix of services and tools that can help you with database and data migration scenarios as well as specialty tasks, see [Services and tools for data migration](../../../dms/dms-tools-matrix.md).
- To learn more about Azure SQL Managed Instance, see:
  - [Service tiers in Azure SQL Managed Instance](../../managed-instance/sql-managed-instance-paas-overview.md#service-tiers)
  - [Differences between SQL Server and Azure SQL Managed Instance](../../managed-instance/transact-sql-tsql-differences-sql-server.md)
  - [Azure Total Cost of Ownership Calculator](https://azure.microsoft.com/pricing/tco/calculator/)
- To learn more about the framework and adoption cycle for cloud migrations, see:
  - [Cloud Adoption Framework for Azure](/azure/cloud-adoption-framework/migrate/azure-best-practices/contoso-migration-scale)
  - [Best practices for costing and sizing workloads migrated to Azure](/azure/cloud-adoption-framework/migrate/azure-best-practices/migrate-best-practices-costs)
- To assess the application access layer, see [Data Access Migration Toolkit (Preview)](https://marketplace.visualstudio.com/items?itemName=ms-databasemigration.data-access-migration-toolkit).
- For details on how to perform A/B testing at the data access layer, see [Database Experimentation Assistant](/sql/dea/database-experimentation-assistant-overview).
azure-sql Oracle To Sql On Azure Vm Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/migration-guides/virtual-machines/oracle-to-sql-on-azure-vm-guide.md
Title: "Oracle to SQL Server on Azure Virtual Machines: Migration guide"
description: This guide teaches you to migrate your Oracle schemas to SQL Server on Azure Virtual Machines by using SQL Server Migration Assistant for Oracle.
Last updated 11/06/2020
# Migration guide: Oracle to SQL Server on Azure Virtual Machines
[!INCLUDE[appliesto-sqldb-sqlmi](../../includes/appliesto-sqldb.md)]
This guide teaches you to migrate your Oracle schemas to SQL Server on Azure Virtual Machines by using SQL Server Migration Assistant for Oracle.

For other migration guides, see [Database Migration](https://docs.microsoft.com/data-migration).

## Prerequisites
To migrate your Oracle schema to SQL Server on Azure Virtual Machines, you need:

- A supported source environment.
- [SQL Server Migration Assistant (SSMA) for Oracle](https://www.microsoft.com/en-us/download/details.aspx?id=54258).
- A target [SQL Server VM](../../virtual-machines/windows/sql-vm-create-portal-quickstart.md).
- The [necessary permissions for SSMA for Oracle](/sql/ssma/oracle/connecting-to-oracle-database-oracletosql) and the [provider](/sql/ssma/oracle/connect-to-oracle-oracletosql).
- Connectivity and sufficient permissions to access the source and the target.
## Pre-migration
To prepare to migrate to the cloud, verify that your source environment is supported and that you've addressed any prerequisites. Doing so will help to ensure an efficient and successful migration.

This part of the process involves:

- Conducting an inventory of the databases that you need to migrate.
- Assessing those databases for potential migration problems or blockers.
- Resolving any problems that you uncover.
### Discover
Use [MAP Toolkit](https://go.microsoft.com/fwlink/?LinkID=316883) to identify existing data sources and details about the features your business is using. Doing so will give you a better understanding of the migration and help you plan for it. This process involves scanning the network to identify your organization's Oracle instances and the versions and features you're using.

To use MAP Toolkit to do an inventory scan, follow these steps:

1. Open [MAP Toolkit](https://go.microsoft.com/fwlink/?LinkID=316883).
1. Select **Create/Select database**:
   ![Screenshot that shows the Create/Select database option.](./media/oracle-to-sql-on-azure-vm-guide/select-database.png)
1. Select **Create an inventory database**. Enter a name for the new inventory database you're creating, provide a brief description, and then select **OK**:
   :::image type="content" source="media/oracle-to-sql-on-azure-vm-guide/create-inventory-database.png" alt-text="Screenshot that shows the interface for creating an inventory database.":::
1. Select **Collect inventory data** to open the **Inventory and Assessment Wizard**:
   :::image type="content" source="media/oracle-to-sql-on-azure-vm-guide/collect-inventory-data.png" alt-text="Screenshot that shows the Collect inventory data link.":::
1. In the **Inventory and Assessment Wizard**, select **Oracle**, and then select **Next**:

   ![Screenshot that shows the Inventory Scenarios page of the Inventory and Assessment Wizard.](./media/oracle-to-sql-on-azure-vm-guide/choose-oracle.png)

1. Select the computer search option that best suits your business needs and environment, and then select **Next**:

   ![Screenshot that shows the Discovery Methods page of the Inventory and Assessment Wizard.](./media/oracle-to-sql-on-azure-vm-guide/choose-search-option.png)
1. Either enter credentials or create new credentials for the systems that you want to explore, and then select **Next**:
   ![Screenshot that shows the All Computers Credentials page of the Inventory and Assessment Wizard.](./media/oracle-to-sql-on-azure-vm-guide/choose-credentials.png)

1. Set the order of the credentials, and then select **Next**:

   ![Screenshot that shows the Credentials Order page of the Inventory and Assessment Wizard.](./media/oracle-to-sql-on-azure-vm-guide/set-credential-order.png)

1. Enter the credentials for each computer you want to discover. You can use unique credentials for each computer, or you can use the All Computers credential list.

   ![Screenshot that shows the Specify Computers and Credentials page of the Inventory and Assessment Wizard.](./media/oracle-to-sql-on-azure-vm-guide/specify-credentials-for-each-computer.png)

1. Verify your selections, and then select **Finish**:

   ![Screenshot that shows the Summary page of the Inventory and Assessment Wizard.](./media/oracle-to-sql-on-azure-vm-guide/review-summary.png)

1. After the scan finishes, view the **Data Collection** summary. The scan might take a few minutes, depending on the number of databases. Select **Close** when you're done:

   ![Screenshot that shows the Data Collection summary.](./media/oracle-to-sql-on-azure-vm-guide/collection-summary-report.png)

1. Select **Options** to generate a report about the Oracle assessment and database details. Select both options, one at a time, to generate the report.
### Assess
After you identify the data sources, use [SQL Server Migration Assistant for Oracle](https://www.microsoft.com/en-us/download/details.aspx?id=54258) to assess the Oracle instances migrating to the SQL Server VM. The assistant will help you understand the gaps between the source and destination databases. You can review database objects and data, assess databases for migration, migrate database objects to SQL Server, and then migrate data to SQL Server.
To create an assessment, follow these steps:
1. Open [SQL Server Migration Assistant for Oracle](https://www.microsoft.com/en-us/download/details.aspx?id=54258).
1. On the **File** menu, select **New Project**.
1. Provide a project name and a location for your project, and then select a SQL Server migration target from the list. Select **OK**:

   ![Screenshot that shows the New Project dialog box.](./media/oracle-to-sql-on-azure-vm-guide/new-project.png)

1. Select **Connect to Oracle**. Enter values for the Oracle connection in the **Connect to Oracle** dialog box:

   ![Screenshot that shows the Connect to Oracle dialog box.](./media/oracle-to-sql-on-azure-vm-guide/connect-to-oracle.png)

   Select the Oracle schemas that you want to migrate:

   ![Screenshot that shows the list of Oracle schemas that can be migrated.](./media/oracle-to-sql-on-azure-vm-guide/select-schema.png)
1. In **Oracle Metadata Explorer**, right-click the Oracle schema that you want to migrate, and then select **Create Report**. Doing so generates an HTML report. Alternatively, you can select the database and then select **Create report** in the top menu.

   ![Screenshot that shows how to create a report.](./media/oracle-to-sql-on-azure-vm-guide/create-report.png)

1. Review the HTML report for conversion statistics, errors, and warnings. Analyze it to understand conversion problems and resolutions.

   You can also open the report in Excel to get an inventory of Oracle objects and the effort required to complete schema conversions. The default location for the report is the report folder in SSMAProjects.
For example: `drive:\<username>\Documents\SSMAProjects\MyOracleMigration\report\report_2016_11_12T02_47_55\`

   ![Screenshot that shows a conversion report.](./media/oracle-to-sql-on-azure-vm-guide/conversion-report.png)

### Validate data types
Validate the default data type mappings and change them based on requirements, if necessary. To do so, follow these steps:
1. On the **Tools** menu, select **Project Settings**.
1. Select the **Type Mappings** tab.

   ![Screenshot that shows the Type Mappings tab.](./media/oracle-to-sql-on-azure-vm-guide/type-mappings.png)
1. You can change the type mapping for each table by selecting the table in **Oracle Metadata Explorer**.
### Convert the schema
To convert the schema, follow these steps:
1. (Optional) To convert dynamic or ad-hoc queries, right-click the node and select **Add statement**.

1. Select **Connect to SQL Server** in the top menu.
   1. Enter connection details for your SQL Server on Azure Virtual Machines instance.
   1. Select your target database from the list, or provide a new name. If you provide a new name, a database will be created on the target server.
   1. Provide authentication details.
   1. Select **Connect**.
   ![Screenshot that shows how to connect to SQL Server.](./media/oracle-to-sql-on-azure-vm-guide/connect-to-sql-vm.png)

1. Right-click the Oracle schema in **Oracle Metadata Explorer** and select **Convert Schema**. Alternatively, you can select **Convert schema** in the top menu:
   ![Screenshot that shows how to convert the schema.](./media/oracle-to-sql-on-azure-vm-guide/convert-schema.png)
1. After the schema conversion is complete, review the converted objects and compare them to the original objects to identify potential problems. Use the recommendations to address any problems:

   ![Screenshot that shows a comparison of two schemas.](./media/oracle-to-sql-on-azure-vm-guide/table-mapping.png)
Compare the converted Transact-SQL text to the original stored procedures and review the recommendations:
   ![Screenshot that shows Transact-SQL, stored procedures, and a warning.](./media/oracle-to-sql-on-azure-vm-guide/procedure-comparison.png)
   You can save the project locally for an offline schema remediation exercise. To do so, select **Save Project** on the **File** menu. Saving the project locally lets you evaluate the source and target schemas offline and perform remediation before you publish the schema to SQL Server.
1. Select **Review results** in the **Output** pane, and then review errors in the **Error list** pane.
1. Save the project locally for an offline schema remediation exercise. Select **Save Project** on the **File** menu. This gives you an opportunity to evaluate the source and target schemas offline and perform remediation before you publish the schema to SQL Server on Azure Virtual Machines.
## Migrate
After you have the necessary prerequisites in place and have completed the tasks associated with the pre-migration stage, you're ready to start the schema and data migration. Migration involves two steps: publishing the schema and migrating the data.
To publish your schema and migrate the data, follow these steps:
1. Publish the schema: right-click the database in **SQL Server Metadata Explorer** and select **Synchronize with Database**. Doing so publishes the Oracle schema to SQL Server on Azure Virtual Machines.
   ![Screenshot that shows the Synchronize with Database command.](./media/oracle-to-sql-on-azure-vm-guide/synchronize-database.png)
Review the mapping between your source project and your target:
   ![Screenshot that shows the synchronization status.](./media/oracle-to-sql-on-azure-vm-guide/synchronize-database-review.png)

1. Migrate the data: right-click the database or object that you want to migrate in **Oracle Metadata Explorer** and select **Migrate Data**. Alternatively, you can select **Migrate Data** in the top menu.
   To migrate data for an entire database, select the check box next to the database name. To migrate data from individual tables, expand the database, expand **Tables**, and then select the check box next to the table. To omit data from individual tables, clear the appropriate check boxes.
   ![Screenshot that shows the Migrate Data command.](./media/oracle-to-sql-on-azure-vm-guide/migrate-data.png)

1. Provide connection details for Oracle and SQL Server on Azure Virtual Machines in the dialog box.
1. After the migration finishes, view the **Data Migration Report**:

   ![Screenshot that shows the Data Migration Report.](./media/oracle-to-sql-on-azure-vm-guide/data-migration-report.png)

1. Connect to your SQL Server on Azure Virtual Machines instance by using [SQL Server Management Studio](/sql/ssms/download-sql-server-management-studio-ssms). Validate the migration by reviewing the data and schema:
   ![Screenshot that shows a SQL Server instance in SSMA.](./media/oracle-to-sql-on-azure-vm-guide/validate-in-ssms.png)

Instead of using SSMA, you could use SQL Server Integration Services (SSIS) to migrate the data. To learn more, see:
- The article [SQL Server Integration Services](/sql/integration-services/sql-server-integration-services).
- The white paper [SSIS for Azure and Hybrid Data Movement](https://download.microsoft.com/download/D/2/0/D20E1C5F-72EA-4505-9F26-FEF9550EFD44/SSIS%20Hybrid%20and%20Azure.docx).
## Post-migration
After you complete the migration stage, you need to complete a series of post-migration tasks to ensure that everything is running as smoothly and efficiently as possible.
### Remediate applications
After the data is migrated to the target environment, all the applications that previously consumed the source need to start consuming the target. In some cases, this requires changes to the applications.
[Data Access Migration Toolkit](https://marketplace.visualstudio.com/items?itemName=ms-databasemigration.data-access-migration-toolkit) is an extension for Visual Studio Code. It allows you to analyze your Java source code and detect data access API calls and queries. The toolkit provides a single-pane view of what needs to be addressed to support the new database back end. To learn more, see [Migrate your Java application from Oracle](https://techcommunity.microsoft.com/t5/microsoft-data-migration/migrate-your-java-applications-from-oracle-to-sql-server-with/ba-p/368727).
### Perform tests
To test your database migration, complete these activities:
1. **Develop validation tests**. To test database migration, you need to use SQL queries. Create the validation queries to run against both the source and target databases. Your validation queries should cover the scope that you've defined.
2. **Set up a test environment**. The test environment should contain a copy of the source database and the target database. Be sure to isolate the test environment.
3. **Run validation tests**. Run the validation tests against the source and the target, and then analyze the results.
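For example, a simple validation test might compare per-table row counts between the source and the target. The following sketch shows a target-side T-SQL query; the corresponding counts on the Oracle source would come from `COUNT(*)` queries or the data dictionary. Table names and scope are up to your validation plan.

```sql
-- Sketch of a target-side validation query: row counts per table.
-- Compare these values with the corresponding counts from the Oracle source.
SELECT t.name AS table_name,
       SUM(p.rows) AS row_count
FROM sys.tables AS t
JOIN sys.partitions AS p
  ON p.object_id = t.object_id
 AND p.index_id IN (0, 1)   -- heap or clustered index only, to avoid double counting
GROUP BY t.name
ORDER BY t.name;
```

Row counts are only a first-pass check; your validation queries should also sample or checksum column values within the scope you've defined.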
### Optimize
The post-migration phase is crucial for reconciling any data accuracy problems and verifying completeness. It's also critical for addressing performance issues with the workload.
> [!NOTE]
> For more information about these problems and specific steps to mitigate them, see the [Post-migration validation and optimization guide](/sql/relational-databases/post-migration-validation-and-optimization-guide).
## Migration resources
For more help with completing this migration scenario, see the following resources, which were developed to support a real-world migration project.
| **Title/Link** | **Description** |
| - | -- |
| [Data Workload Assessment Model and Tool](https://github.com/Microsoft/DataMigrationTeam/tree/master/Data%20Workload%20Assessment%20Model%20and%20Tool) | This tool provides suggested best-fit target platforms, cloud readiness, and application/database remediation levels for a given workload. It offers simple one-click calculation and report generation that helps to accelerate large estate assessments by providing an automated and uniform target-platform decision process. |
| [Oracle Inventory Script Artifacts](https://github.com/Microsoft/DataMigrationTeam/tree/master/Oracle%20Inventory%20Script%20Artifacts) | This asset includes a PL/SQL query that targets Oracle system tables and provides a count of objects by schema type, object type, and status. It also provides a rough estimate of raw data in each schema and the sizing of tables in each schema, with results stored in a CSV format. |
| [Automate SSMA Oracle Assessment Collection & Consolidation](https://github.com/microsoft/DataMigrationTeam/tree/master/IP%20and%20Scripts/Automate%20SSMA%20Oracle%20Assessment%20Collection%20%26%20Consolidation) | This set of resources uses a .csv file as entry (sources.csv in the project folders) to produce the XML files that you need to run an SSMA assessment in console mode. You provide the source.csv file by taking an inventory of existing Oracle instances. The output files are AssessmentReportGeneration_source_1.xml, ServersConnectionFile.xml, and VariableValueFile.xml. |
| [SSMA issues and possible remedies when migrating Oracle databases](https://aka.ms/dmj-wp-ssma-oracle-errors) | With Oracle, you can assign a non-scalar condition in a WHERE clause. SQL Server doesn't support this type of condition. So SSMA for Oracle doesn't convert queries that have a non-scalar condition in the WHERE clause. Instead, it generates an error: O2SS0001. This white paper provides details on the problem and ways to resolve it. |
| [Oracle to SQL Server Migration Handbook](https://github.com/microsoft/DataMigrationTeam/blob/master/Whitepapers/Oracle%20to%20SQL%20Server%20Migration%20Handbook.pdf) | This document focuses on the tasks associated with migrating an Oracle schema to the latest version of SQL Server. If the migration requires changes to features/functionality, you need to carefully consider the possible effect of each change on the applications that use the database. |
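The inventory script asset described above works against the Oracle data dictionary. A hedged sketch of that kind of query, counting objects by schema and type (the `WHERE` filter here is an illustrative assumption, not the asset's exact logic), might look like:

```sql
-- Sketch of an Oracle inventory query: count objects by schema, type, and status.
-- Run against the Oracle source; excluded schemas are an illustrative assumption.
SELECT owner,
       object_type,
       status,
       COUNT(*) AS object_count
FROM all_objects
WHERE owner NOT IN ('SYS', 'SYSTEM')
GROUP BY owner, object_type, status
ORDER BY owner, object_type;
```

Spool or export the result to CSV to feed downstream assessment steps.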

The Data SQL Engineering team developed these resources. This team's core charter is to unblock and accelerate complex modernization for data-platform migration projects to the Microsoft Azure data platform.
## Next steps

- To check the availability of services applicable to SQL Server, see the [Azure global infrastructure center](https://azure.microsoft.com/global-infrastructure/services/?regions=all&amp;products=synapse-analytics,virtual-machines,sql-database).
- For a matrix of the Microsoft and third-party services and tools that are available to help you with various database and data migration scenarios and specialized tasks, see [Services and tools for data migration](../../../dms/dms-tools-matrix.md).
- To learn more about Azure SQL, see:
- [Deployment options](../../azure-sql-iaas-vs-paas-what-is-overview.md)
  - [SQL Server on Azure Virtual Machines](../../virtual-machines/windows/sql-server-on-azure-vm-iaas-what-is-overview.md)
  - [Azure Total Cost of Ownership Calculator](https://azure.microsoft.com/pricing/tco/calculator/)
- To learn more about the framework and adoption cycle for cloud migrations, see:
- [Cloud Adoption Framework for Azure](/azure/cloud-adoption-framework/migrate/azure-best-practices/contoso-migration-scale)
  - [Best practices to cost and size workloads migrated to Azure](/azure/cloud-adoption-framework/migrate/azure-best-practices/migrate-best-practices-costs)
- For information about licensing, see:
  - [Bring your own license with the Azure Hybrid Benefit](../../virtual-machines/windows/licensing-model-azure-hybrid-benefit-ahb-change.md)
  - [Get free extended support for SQL Server 2008 and SQL Server 2008 R2](../../virtual-machines/windows/sql-server-2008-extend-end-of-support.md)
+- To assess the application access layer, use [Data Access Migration Toolkit Preview](https://marketplace.visualstudio.com/items?itemName=ms-databasemigration.data-access-migration-toolkit).
+- For details on how to do data access layer A/B testing, see [Overview of Database Experimentation Assistant](/sql/dea/database-experimentation-assistant-overview).
-- To assess the Application access layer, see [Data Access Migration Toolkit (Preview)](https://marketplace.visualstudio.com/items?itemName=ms-databasemigration.data-access-migration-toolkit)
-- For details on how to perform Data Access Layer A/B testing see [Database Experimentation Assistant](/sql/dea/database-experimentation-assistant-overview).
backup Backup Support Matrix Iaas https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/backup/backup-support-matrix-iaas.md
Backup of Azure VMs with locks | Unsupported for unmanaged VMs. <br><br> Support
[Spot VMs](../virtual-machines/spot-vms.md) | Unsupported. Azure Backup restores Spot VMs as regular Azure VMs. [Azure Dedicated Host](../virtual-machines/dedicated-hosts.md) | Supported Windows Storage Spaces configuration of standalone Azure VMs | Supported
+[Azure VM Scale Sets](../virtual-machine-scale-sets/virtual-machine-scale-sets-orchestration-modes.md#scale-sets-with-flexible-orchestration) | Supported for both uniform and flexible orchestration models to back up and restore a single Azure VM.
## VM storage support
batch Managed Identity Pools https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/batch/managed-identity-pools.md
var poolParameters = new Pool(name: "yourPoolName")
"18.04-LTS", "latest"), "batch.node.ubuntu 18.04")
- };
+ },
Identity = new BatchPoolIdentity { Type = PoolIdentityType.UserAssigned,
batch Managed Identity Pools https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/best-practices-availability-paired-regions.md
Previously updated : 03/03/2020 Last updated : 03/30/2021
A regional pair consists of two regions within the same geography. Azure seriali
Some Azure services take further advantage of paired regions to ensure business continuity and to protect against data loss. Azure provides several [storage solutions](./storage/common/storage-redundancy.md#redundancy-in-a-secondary-region) that take advantage of paired regions to ensure data availability. For example, [Azure Geo-redundant Storage](./storage/common/storage-redundancy.md#geo-redundant-storage) (GRS) replicates data to a secondary region automatically, ensuring that data is durable even in the event that the primary region isn't recoverable.
-Note that not all Azure services automatically replicate data, nor do all Azure services automatically fall-back from a failed region to its pair. In such cases, recovery and replication must be configured by the customer.
+Note that not all Azure services automatically replicate data, nor do all Azure services automatically fall back from a failed region to its pair. In such cases, recovery and replication must be configured by the customer.
## Can I select my regional pairs?
No. Customers can leverage Azure services to architect a resilient service witho
|: |: |: | | Asia-Pacific |East Asia (Hong Kong) | Southeast Asia (Singapore) | | Australia |Australia East |Australia Southeast |
-| Australia |Australia Central |Australia Central 2 |
+| Australia |Australia Central |Australia Central 2* |
| Brazil |Brazil South |South Central US |
+| Brazil |Brazil Southeast* |Brazil South |
| Canada |Canada Central |Canada East | | China |China North |China East| | China |China North 2 |China East 2| | Europe |North Europe (Ireland) |West Europe (Netherlands) |
-| France |France Central|France South|
-| Germany |Germany Central |Germany Northeast |
+| France |France Central|France South*|
+| Germany |Germany West Central |Germany North* |
| India |Central India |South India | | India |West India |South India | | Japan |Japan East |Japan West |
No. Customers can leverage Azure services to architect a resilient service witho
| North America |East US 2 |Central US | | North America |North Central US |South Central US | | North America |West US 2 |West Central US |
-| Norway | Norway East | Norway West |
-| South Africa | South Africa North |South Africa West |
-| Switzerland | Switzerland North |Switzerland West |
+| Norway | Norway East | Norway West* |
+| South Africa | South Africa North |South Africa West* |
+| Switzerland | Switzerland North |Switzerland West* |
| UK |UK West |UK South |
-| United Arab Emirates | UAE North | UAE Central
-| US Department of Defense |US DoD East |US DoD Central |
-| US Government |US Gov Arizona |US Gov Texas |
-| US Government |US Gov Iowa |US Gov Virginia |
-| US Government |US Gov Virginia |US Gov Texas |
+| United Arab Emirates | UAE North | UAE Central* |
+| US Department of Defense |US DoD East* |US DoD Central* |
+| US Government |US Gov Arizona* |US Gov Texas* |
+| US Government |US Gov Iowa* |US Gov Virginia* |
+| US Government |US Gov Virginia* |US Gov Texas* |
+
+(*) Certain regions are access restricted to support specific customer scenarios, for example in-country disaster recovery. These regions are available only upon request by [creating a new support request in the Azure portal](https://portal.azure.com/#blade/Microsoft_Azure_Support/HelpAndSupportBlade/newsupportrequest).
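The pairing relationships in the table above can be treated as a directional lookup; here is a minimal illustrative sketch in Python. The pairs are hardcoded from a few rows of the table, not fetched from any Azure API:

```python
# Illustrative lookup of Azure regional pairs.
# Pairs below are copied from a few rows of the table above;
# this is a sketch, not data retrieved from an Azure API.
REGION_PAIRS = {
    "Brazil South": "South Central US",
    "East US 2": "Central US",
    "North Central US": "South Central US",
    "West US 2": "West Central US",
    "UK West": "UK South",
}

def paired_region(primary: str) -> str:
    """Return the secondary (paired) region for a primary region.

    The mapping is kept directional because pairing is not always
    symmetric (for example, West India's secondary is South India,
    but South India's secondary is Central India).
    """
    if primary not in REGION_PAIRS:
        raise KeyError(f"No pair recorded for region: {primary}")
    return REGION_PAIRS[primary]

print(paired_region("UK West"))  # UK South
```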
> [!Important] > - West India is paired in one direction only. West India's secondary region is South India, but South India's secondary region is Central India.
cdn Cdn Caching Rules Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cdn/cdn-caching-rules-tutorial.md
Last updated 04/20/2018
-# As an Azure CDN administrator, I want to create custom rules on my CDN endpoint so that I can control how content is cached.
+#Customer intent: As an Azure CDN administrator, I want to create custom rules on my CDN endpoint so that I can control how content is cached.
cdn Cdn Custom Ssl https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cdn/cdn-custom-ssl.md
Last updated 03/26/2021
-# As a website owner, I want to enable HTTPS on the custom domain of my CDN endpoint so that my users can use my custom domain to access my content securely.
-
+#Customer intent: As a website owner, I want to enable HTTPS on the custom domain of my CDN endpoint so that my users can use my custom domain to access my content securely.
+ # Tutorial: Configure HTTPS on an Azure CDN custom domain This tutorial shows how to enable the HTTPS protocol for a custom domain that's associated with an Azure CDN endpoint.
cdn Cdn Features https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cdn/cdn-features.md
The following table compares the features available with each product.
| **Ease of use** | **Standard Microsoft** | **Standard Akamai** | **Standard Verizon** | **Premium Verizon** | | Easy integration with Azure services, such as [Storage](cdn-create-a-storage-account-with-cdn.md), [Web Apps](cdn-add-to-web-app.md), and [Media Services](../media-services/previous/media-services-portal-manage-streaming-endpoints.md) | **&#x2713;** |**&#x2713;** |**&#x2713;** |**&#x2713;** | | Management via [REST API](/rest/api/cdn/), [.NET](cdn-app-dev-net.md), [Node.js](cdn-app-dev-node.md), or [PowerShell](cdn-manage-powershell.md) | **&#x2713;** |**&#x2713;** |**&#x2713;** |**&#x2713;** |
-| [Compression MIME types](./cdn-improve-performance.md) |Default only |Configurable |Configurable |Configurable |
+| [Compression MIME types](./cdn-improve-performance.md) |Configurable |Configurable |Configurable |Configurable |
| Compression encodings |gzip, brotli |gzip |gzip, deflate, bzip2, brotli |gzip, deflate, bzip2, brotli | ## Migration
cdn Cdn Map Content To Custom Domain https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cdn/cdn-map-content-to-custom-domain.md
Last updated 02/04/2020
-# As a website owner, I want to add a custom domain to my CDN endpoint so that my users can use my custom domain to access my content.
+#Customer intent: As a website owner, I want to add a custom domain to my CDN endpoint so that my users can use my custom domain to access my content.
+ # Tutorial: Add a custom domain to your endpoint This tutorial shows how to add a custom domain to an Azure Content Delivery Network (CDN) endpoint.
certification Concepts Marketing https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/certification/concepts-marketing.md
+
+ Title: Marketing properties
+description: A description of the different marketing fields collected in the portal and how they will appear on the Azure Certified Device catalog
++++ Last updated : 03/15/2021++
+# Marketing properties
+
+In the process of [adding your device details](tutorial-02-adding-device-details.md), you will be required to supply marketing information that will be displayed on the [Azure Certified Device catalog](https://devicecatalog.azure.com). This information is collected within the Azure Certified Device portal during the certification submission process and will be used as filter parameters on the catalog. This article maps the fields collected in the portal to how they appear on the catalog. After reading this article, partners should better understand what information to provide during the certification process to best represent their product on the catalog.
+
+![PDP overview](./media/concepts-marketing/pdp-overview.png)
+
+## Azure Certified Device catalog product tile
+
+Visitors to the catalog will first interact with your device as a catalog product tile on the search page. This tile provides a basic overview of the device and the certifications it has been awarded.
+
+![Product tile template](./media/concepts-marketing/product-tile.png)
+
+| Field | Description | Where to add in the portal |
+|--|--|--|
+| Device Name | Public name of your certified device | Basics tab of Device details|
+| Company name| Public name of your company | Not editable in the portal. Extracted from MPN account name |
+| Product photo | Image of your device with minimum resolution 200p x 200p | Marketing details |
+| Certification classification | Mandatory Azure Certified Device certification label and optional certification badges | Basics tab of Device details. Must pass appropriate testing in Connect & test section. |
+
+## Product description page information
+
+Once a customer has clicked on your device tile from the catalog search page, they will be navigated to the product description page of your device. This is where the bulk of the information provided during the certification process will be found.
+
+The top of the product description page highlights key characteristics, some of which were already used for the product tile.
+
+![PDP top bar](./media/concepts-marketing/pdp-top.png)
+
+| Field | Description | Where to add in the portal |
+|--|--|--|
+| Device class | Classification of the form factor and primary purpose of your device ([Learn more](./resources-glossary.md)) | Basics tab of Device details|
+| Device type | Classification of device based on implementation readiness ([Learn more](./resources-glossary.md)) | Basics tab of Device details |
+| Geo availability | Regions that your device is available for purchase | Marketing details |
+| Operating systems | Operating system(s) that your device supports | Product details tab of Device details |
+| Target industries | Top 3 industries that your device is optimized for | Marketing details |
+| Product description | Free text field for you to write your marketing description of your product. This can capture details not listed in the portal, or add additional context for the benefits of using your device. | Marketing details|
+
+The remainder of the page is focused on displaying the technical specifications of your device in table format that will help your customer better understand your product. For convenience, the information displayed at the top of the page is also listed here. The rest of the table is sectioned by the components specified in the portal.
+
+![PDP bottom page](./media/concepts-marketing/pdp-bottom.png)
+
+| Field | Description | Where to add in the portal |
+|--|--|--|
+| Component type | Classification of the form factor and primary purpose of your device ([Learn more](./resources-glossary.md)) | Product details of Device details|
+| Component name| Name of the component you are describing | Product details of Device details |
+| Additional component information | Additional hardware specifications such as included sensors, connectivity, accelerators, etc. | Additional component information of Device details ([Learn more](./how-to-using-the-components-feature.md)) |
+| Device dependency text | Partner-provided text describing the different dependencies the product requires to connect to Azure ([Learn more](./how-to-indirectly-connected-devices.md)) | Customer-facing comments section of Dependencies tab of Device details |
+| Device dependency link | Link to a certified device that your current product requires | Dependencies tab of Device details |
+
+## Shop links
+A Shop button is available on both the product tile and the product description page. When the customer clicks it, a window opens that allows them to select a distributor (you can list up to 5 distributors). Once selected, the customer is redirected to the partner-provided URL.
+
+![Image of Shop pop-up experience](./media/concepts-marketing/shop.png)
+
+| Field | Description | Where to add in the portal |
+|--|--|--|
+| Distributor name | Name of the distributor who is selling your product | Marketing details|
+| Get Device| Link to external website for the customer to purchase the device (or request a quote from the distributor). This may be the same as the Manufacturer's page if the distributor is the same as the device manufacturer. If a purchase page is not available, this will redirect to the distributor's page for the customer to contact them directly. | Distributor product page URL in marketing details. If no purchase page is available, link will default to Distributor URL in Marketing detail. |
+
+## External links
+Also included within the Product Description page are links that navigate to partner-provided sites or files that help the customer better understand the product. They appear towards the top of the page, beneath the product description text. The links displayed will differ for different device types and certification programs.
+
+| Link | Description | Where to add in the portal |
+|--|--|--|
+| Get Started guide* | PDF file with user instructions to connect and use your device with Azure services | Add 'Get Started' guide section of the portal|
+| Manufacturer's page*|Link to manufacturer's page. This page may be the specific product page for your device, or to the company home page if a marketing page is not available. | Manufacturer's marketing page in Marketing details |
+| Device model | Public DTDL models for IoT Plug and Play solutions | Not editable in the portal. Device model must be uploaded to the [public model repository](https://aka.ms/modelrepo). |
+| Device source code | URL to device source code for Dev Kit device types| Basics tab of Device details |
+
+ **Required for all published devices*
+
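As background for the Device model row above: IoT Plug and Play device models are written in DTDL. A minimal illustrative interface follows; the `dtmi:com:example:Thermostat;1` identifier is a placeholder, not an entry in the public model repository:

```json
{
  "@context": "dtmi:dtdl:context;2",
  "@id": "dtmi:com:example:Thermostat;1",
  "@type": "Interface",
  "displayName": "Thermostat",
  "contents": [
    {
      "@type": ["Telemetry", "Temperature"],
      "name": "temperature",
      "schema": "double",
      "unit": "degreeCelsius"
    }
  ]
}
```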
+## Next steps
+Now that you understand how we use the information you provide during certification, you're ready to certify your device! Begin your certification project, or jump back into the device details stage to add your own marketing information.
+
+- [Start your certification journey](./tutorial-00-selecting-your-certification.md)
+- [Adding device details](./tutorial-02-adding-device-details.md)
certification How To Edit Published Device https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/certification/how-to-edit-published-device.md
+
+ Title: How to edit your published Azure Certified Device
+description: A guide to editing your device information after you have certified and published your device through the Azure Certified Device program.
++++ Last updated : 03/03/2021+++
+# Edit your published device
+
+After your device has been certified and published to the Azure Certified Device catalog, you may need to update your device details. This may be due to an update to your distributor list, changes to purchase page URLs, or updates to the hardware specifications (such as operating system version or a new component addition). Using the Azure Certified Device portal, we make it easy to update your device information without removing your product from our catalog.
+
+## Prerequisites
+
+- You should be signed in and have an **approved** project for your device on the [Azure Certified Device portal](https://certify.azure.com). If you don't have a certified device, you can view this [tutorial](tutorial-01-creating-your-project.md) to get started.
+
+## Editing your published project
+
+On the project summary, you should notice that your project is in read-only mode since it has already been reviewed and accepted. To make changes, you will have to request an edit to your project and have the update reapproved by the Azure Certification team.
+
+1. Click the `Request Metadata Edit` button at the top of the page
+
+ ![Request metadata update](./media/images/request-metadata-edit.png)
+
+1. Acknowledge the notification on the page that you will be required to submit your product for review after editing.
+ > [!NOTE]
+ > By confirming this edit, you are **not** removing your device from the Azure Certified Device catalog if it has already been published. Your previous version of the product will remain on the catalog until you have republished your device.
+
+1. After acknowledging this warning, you can edit your device details. Make sure to leave a note in the `Comments for Reviewer` section of `Device Details` describing what has been changed.
+
+ ![Note of metadata edit](./media/images/edit-notes.png)
+
+1. On the project summary page, click `Submit for review` to have your changes reapproved by the Azure Certification team.
+1. After your changes have been reviewed and approved, you can then republish your changes to the catalog through the portal (See our [tutorial](./tutorial-04-publishing-your-device.md)).
+
+## Next steps
+
+You've now successfully edited your device on the Azure Certified Device catalog. You can check out your changes on the catalog, or certify another device!
+- [Azure Certified Device catalog](https://devicecatalog.azure.com/)
+- [Get started with certifying a device](./tutorial-01-creating-your-project.md)
certification How To Indirectly Connected Devices https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/certification/how-to-indirectly-connected-devices.md
+
+ Title: Certifying device bundles and indirectly connected devices
+
+description: See how to submit an indirectly connected device for certification.
++ Last updated : 02/23/2021++++
++
+# Device bundles and indirectly connected devices
+
+To support devices that interact with Azure through another device, or through SaaS or PaaS offerings, our submission portal (https://www.certify.azure.com) and device catalog (https://devicecatalog.azure.com) support the concepts of bundling and dependencies, which give these device combinations access to our Azure Certified Device program.
+
+Depending on your product line and services offered, your situation may require a combination of these steps:
++
+![Create project dependencies](./media/indirect-connected-device/picture-1.png )
+## Sensors and indirect devices
+Many sensors require a device to connect to Azure. In addition, you may have multiple compatible devices that will work with the sensor device. **To accommodate these scenarios, you must first certify the device(s) before certifying the sensor that will pass information through them.**
+
+Example matrix of submission combinations
+![Submission example](./media/indirect-connected-device/picture-2.png )
+
+To certify your sensor, which requires a separate device:
+1. First, [certify the device](https://certify.azure.com) and publish it to the Azure Certified Device Catalog
+ - If you have multiple compatible passthrough devices (as in the example above), submit them separately for certification and publish them to the catalog as well
+2. With the sensor connected through the device, submit the sensor for certification
+ * In the "Dependencies" tab of the "Device details" section, set the following values
+ * Dependency type = "Hardware gateway"
+ * Dependency URL = "URL link to the device on the device catalog"
+ * Used during testing = "Yes"
+ * Add any Customer-facing comments that should be provided to a user who sees the product description in the device catalog. (example: "Series 100 devices are required for sensors to connect to Azure")
+
+3. If you have more devices you would like added as optional for this device, you can select "+ Add additional dependency". Then follow the same guidance and note that it was not used during testing. In the Customer-facing comments, ensure your customers are aware that other devices associated with this sensor are available (as an alternative to the device that was used during testing).
+
+![Hardware dependency type](./media/indirect-connected-device/picture-3.png)
+
+## PaaS and SaaS offerings
+As part of your product portfolio, you may have devices that you certify, but your device also requires other services from your company or other third-party companies. To add this dependency, follow these steps:
+1. Start the submission process for your device
+2. In the "Dependencies" tab, set the following values
+ - Dependency type = "Software service"
+ - Service name = "[your product name]"
+ - Dependency URL = "URL link to a product page that describes the service"
+ - Add any customer-facing comments that should be provided to a user who sees the product description in the Azure Certified Device Catalog
+3. If you have other software, services or hardware dependencies you would like added as optional for this device, you can select "+ Add additional dependency" and follow the same guidance.
+
+![Software dependency type](./media/indirect-connected-device/picture-4.png )
+
+## Bundled products
+Bundled product listings are simply the successful certification of a device together with other components that will be sold as part of the bundle in one product listing. You can submit a device that includes extra components such as a temperature sensor and a camera sensor (#1), or you can submit a touch sensor that includes a passthrough device (#2). Through the "Component" feature, you can add multiple components to your listing.
+
+If you intend to do this, you should format the product listing image to indicate that this product comes with other components. In addition, if your bundle requires additional services to certify, you will need to identify those through the services dependency.
+Example matrix of bundled products
+
+![Bundle submission example](./media/indirect-connected-device/picture-5.png )
+
+For a more detailed description on how to use the component functionality in the Azure Certified Device portal, see our [help documentation](./how-to-using-the-components-feature.md).
+
+If a device is a passthrough device with a separate sensor in the same product, create one component to reflect the passthrough device, and another component to reflect the sensor. Components can be added to your project in the Product details tab of the Device details section:
+
+![Adding components](./media/indirect-connected-device/picture-6.png )
+
+For the passthrough device, set the Component type as a Customer Ready Product, and fill in the other fields as relevant for your product. Example:
+
+![Component details](./media/indirect-connected-device/picture-7.png )
+
+For the sensor, add a second component, setting the Component type as Peripheral and Attachment method as Discrete. Example:
+
+![Second component details](./media/indirect-connected-device/picture-8.png )
+
+Once the Sensor component has been created, edit the details, navigate to the Sensors tab, and then add the sensor details. Example:
+
+![Sensor details](./media/indirect-connected-device/picture-9.png )
+
+Complete your project details and submit your device for certification as normal.
+
certification How To Using The Components Feature https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/certification/how-to-using-the-components-feature.md
+
+ Title: How to use the components feature in the Azure Certified Device portal
+description: A guide on how to best use the components feature of the Device details section to accurately describe your device
++++ Last updated : 03/03/2021+++
+# Add components on the portal
+
+While completing the [tutorial to add device details](tutorial-02-adding-device-details.md) to your certification project, you will be expected to describe the hardware specifications of your device. To do so, you can highlight multiple, separate hardware products (referred to as **components**) that make up your device. This enables you to better promote devices that come with additional hardware, and allows customers to find the right product by searching the catalog based on these features.
+
+## Prerequisites
+
+- You should be signed in and have a project for your device created on the [Azure Certified Device portal](https://certify.azure.com). For more information, view the [tutorial](tutorial-01-creating-your-project.md).
+
+## How to add components
+
+Every project submitted for certification will include one **Customer Ready Product** component (which in many cases will represent the holistic product itself). To better understand the Customer Ready Product component type, view our [certification glossary](./resources-glossary.md). You may include any additional components at your discretion to accurately capture your device.
+
+1. Select `Add a component` on the Product details tab.
+
+ ![Add a component link](./media/images/add-a-component-link.png)
+
+1. Complete relevant form fields for the component.
+
+ ![Component details section](./media/images/component-details-section.png)
+
+1. Save your information using the `Save Product Details` button at the bottom of the page:
+
+ ![Save Product Details button](./media/images/save-product-details-button.png)
+
+1. Once you have saved your component, you can further tailor the hardware capabilities it supports. Select the `Edit` link by the component name.
+
+ ![Edit Component button](./media/images/component-edit.png)
+
+1. Provide relevant hardware capability information where appropriate.
+
+ ![Image of editable component sections](./media/images/component-selection-area.png)
+
+ The editable component fields (shown above) include:
+
+ - **General**: Hardware details such as processors and secure hardware
+ - **Connectivity**: Connectivity options, protocols, and interfaces such as radio(s) and GPIO
+ - **Accelerators**: Specify hardware acceleration such as GPU and VPU
+ - **Sensors**: Specify available sensors such as GPS and vibration
+ - **Additional Specs**: Additional information about the device such as physical dimensions and storage/battery information
+
+1. Select `Save Product Details` at the bottom of the Product details page.
+
+## Component use requirements and recommendations
+
+You may have questions regarding how many components to include, or what component type to use. Below are a few sample scenarios of devices that you may be certifying and how you can use the components feature.
+
+| Product Type | No. Components | Component 1 / Attachment Type | Components 2+ / Attachment Type |
+|--|--|--|--|
+| Finished Product | 1 | Customer Ready Product, Discrete | N/A |
+| Finished Product with **detachable peripheral(s)** | 2 or more | Customer Ready Product, Discrete | Peripheral / Discrete or Integrated |
+| Finished Product with **integrated component(s)** | 2 or more | Customer Ready Product, Discrete | Select appropriate type / Discrete or integrated |
+| Solution-Ready Dev Kit | 2 or more | Customer Ready Product, Discrete | Select appropriate type / Discrete or integrated |
+
+## Example component usage
+
+Below are examples of how an OEM called Contoso would use the components feature to certify their product, called Falcon.
+
+1. Falcon is a complete stand-alone device that does not integrate into a larger product.
+ 1. No. of components: 1
+ 1. Component device type: Customer Ready Product
+ 1. Attachment type: Discrete
+
+ ![Image of customer ready product](./media/images/customer-ready-product.png)
+
+1. Falcon is a device that includes an integrated peripheral camera module manufactured by INC Electronics that connects via USB to Falcon.
+ 1. No. of components: 2
+ 1. Component device type: Customer Ready Product, Peripheral
+ 1. Attachment type: Discrete, Integrated
+
+ > [!Note]
+ > The peripheral component is considered integrated because it is not removable.
+
+ ![Image of peripheral example component](./media/images/peripheral.png)
+
+1. Falcon is a device that includes an integrated System on Module from INC Electronics that uses a built-in processor Apollo52 from company Espressif and has an ARM64 architecture.
+ 1. No. of components: 2
+ 1. Component device type: Customer Ready Product, System on Module
+ 1. Attachment type: Discrete, Integrated
+
+ > [!Note]
+ > The System on Module component is considered integrated because it is not removable. The SoM component would also include processor information.
+
+ ![Image of system on module example component ](./media/images/system-on-module.png)
+
+## Additional tips
+
+We've provided more clarifications below regarding our component usage policy. If you have any questions about appropriate component usage, contact our team at [iotcert@microsoft.com](mailto:iotcert@microsoft.com), and we'll be more than happy to help!
+
+1. A project must contain **only** one Customer Ready Product component. If you are certifying a project with two independent devices, those devices should be certified separately.
+1. It is primarily up to you to use (or not use) components to promote your device's capabilities to potential customers.
+1. During our review of your device, the Azure Certification team will only require at least one Customer Ready Product component to be listed. However, we may request edits to the component information if the details are not clear or appear to be lacking (for example, component manufacturer is not supplied for a Customer Ready Product type).
+
+## Next steps
+
+Now that you're familiar with our components feature, you're ready to complete your device details or edit your project for further clarity.
+
+- [Tutorial: Adding device details](tutorial-02-adding-device-details.md)
+- [Editing your published device](how-to-edit-published-device.md)
+
certification Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/certification/overview.md
+
+ Title: Overview of the Azure Certified Device program
+description: An overview of the Azure Certified Device program for our partners and customers
+++ Last updated : 03/03/2021++++
+# What is Azure Certified Device?
+
+Thank you for your interest in the Azure Certified Device program! This program is your one stop for easily differentiating, promoting, and finding IoT devices built to run on Azure. From intelligent cameras to connected sensors to edge infrastructure, this enhanced IoT device certification program helps device builders increase their product visibility and saves customers time in building solutions.
+
+## Our certification promise
+
+The Azure Certified Device program ensures customer solutions work great on Azure. The program uses tools, services, and a catalog to share industry knowledge with our community of builders within the IoT ecosystem, helping builders and customers alike.
+
+The three tenets of this program are:
+
+- **Giving customers confidence:** Customers can confidently purchase Azure certified devices that carry the Microsoft promise.
+
+- **Matchmaking customers with the right devices for them:** Device builders can set themselves apart with certification that highlights their unique capabilities, and customers can easily find the products that fit their needs.
+
+- **Promoting certified devices:** Device builders get increased visibility, contact with customers, and usage of Microsoft's Azure Certified Device brand.
+
+## User roles
+
+The Azure Certified Device program serves two different audiences.
+
+1. **Device builders**: Easily differentiate your IoT device capabilities and gain access to a worldwide audience looking to reliably purchase devices built to run on Azure. Use the Azure Certified Device Catalog to increase product visibility and connect with customers by certifying your device.
+1. **Solution builders**: Confidently find and purchase IoT devices built to run on Azure, knowing they meet specific capabilities. Easily search and select the right certified device for your IoT solution on the [Azure Certified Device catalog](https://devicecatalog.azure.com/).
+
+## Our certification programs
+
+There are four different certifications available now! Each certification is focused on delivering a different customer value. Depending on the type of device and your target audience, you can choose which certifications are most applicable to apply for. Select the title of each program to learn more about its requirements.
+
+| Certification program | Overview |
+|-|-|
+| [Azure Certified Device](program-requirements-azure-certified-device.md) | Azure Certified Device certification validates that a device can connect with Azure IoT Hub and securely provision through the Device Provisioning Service (DPS). This certification reflects a device's functionality and interoperability, which are a **required baseline** for all other certifications. |
+| [IoT Plug and Play](program-requirements-pnp.md) | IoT Plug and Play certification, an incremental certification beyond the baseline Azure Certified Device certification, validates Digital Twin Definition Language version 2 (DTDL) and interaction based on your device model. It enables a seamless device-to-cloud integration experience and enables hardware partners to build devices that can seamlessly integrate without the need to write custom code. |
+| [Edge Managed](program-requirements-edge-managed.md) | Edge Managed certification, an incremental certification beyond the baseline Azure Certified Device certification, focuses on device management standards for Azure connected devices. |
+| [Edge Secured Core](program-requirements-edge-secured-core.md) | Edge Secured-core certification, an incremental certification beyond the baseline Azure Certified Device certification, is for IoT devices running a full operating system such as Linux or Windows 10 IoT. It validates devices meet additional security requirements around device identity, secure boot, operating system hardening, device updates, data protection, and vulnerability disclosures. |
+
+## How to certify your device
+
+Certifying a device involves four major steps on the [Azure Certified Device portal](https://certify.azure.com):
+
+1. Creating your project
+1. Providing hardware capability information
+1. Validating device functionality
+1. Submitting and completing the review process
+
+Once you have certified your device, you can optionally complete two additional activities:
+
+1. Publishing to the Azure Certified Device Catalog (optional)
+1. Updating your project after it has been approved/published (optional)
+
+## Next steps
+
+Ready to get started with your certification journey? View our resources below to begin certifying your device!
+
+- [Starting the certification process](tutorial-00-selecting-your-certification.md)
+- If you have additional questions or feedback, contact [the Azure Certified Device team](mailto:iotcert@microsoft.com).
certification Program Requirements Azure Certified Device https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/certification/program-requirements-azure-certified-device.md
+
+ Title: Azure Certified Device Requirements
+description: Azure Certified Device program requirements
+Last updated: 03/15/2021
+# Azure Certified Device Requirements
+(previously known as IoT Hub)
+
+This document outlines the device specific capabilities that will be represented in the Azure Certified Device catalog. A capability is a singular device attribute that may be a software implementation or a combination of software and hardware implementations.
+
+## Program Purpose
+
+Microsoft is simplifying IoT. Azure Certified Device certification is the baseline certification program that ensures any device type can be provisioned to Azure IoT Hub securely.
+
+The promises of Azure Certified Device certification are:
+
+1. The device supports telemetry that works with IoT Hub
+2. The device supports the IoT Hub Device Provisioning Service (DPS) to securely provision to Azure IoT Hub
+3. The device supports easy input of the target DPS ID scope without requiring the user to recompile embedded code
+4. Optionally validates other elements such as cloud-to-device messages, direct methods, and device twins
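Promise 3 is usually met by reading the DPS ID scope from external configuration rather than baking it into firmware. A minimal Python sketch of that pattern (the file layout and key names here are illustrative, not part of the certification requirements):

```python
import json
import tempfile

def load_dps_settings(path):
    """Read DPS provisioning settings from a JSON config file so the
    ID scope can change without recompiling or re-flashing device code."""
    with open(path) as f:
        cfg = json.load(f)
    # Fail fast if the scope is missing rather than provisioning blindly.
    if "id_scope" not in cfg:
        raise ValueError("config must supply a DPS id_scope")
    return cfg["id_scope"], cfg.get("registration_id")

# Example: write a config file, then load it back.
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump({"id_scope": "0ne000AAAAA", "registration_id": "device-01"}, f)
    path = f.name

scope, reg_id = load_dps_settings(path)
print(scope, reg_id)
```

A field technician can then retarget the device to a different DPS instance by editing one file, which is the behavior the certification checks for.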
+
+## Requirements
+
+**[Required] Device to cloud: The purpose of this test is to make sure devices that send telemetry work with IoT Hub**
+
+| **Name** | AzureCertified.D2C |
+| -- | |
+| **Target Availability** | Available now |
+| **Applies To** | Leaf device/Edge device |
+| **OS** | Agnostic |
+| **Validation Type** | Automated |
+| **Validation** | Device must send any telemetry schemas to IoT Hub. Microsoft provides the [portal workflow](https://certify.azure.com) to execute the tests. Device to cloud (required): **1.** Validates that the device can send messages to the AICS-managed IoT Hub **2.** User must specify the number and frequency of messages. **3.** AICS validates the telemetry is received by the Hub instance |
+| **Resources** | [Certification steps](./overview.md) (has all the additional resources) |
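The validation above asks the user to specify the number and frequency of messages. The following sketch shows how a device might generate that stream; the payload fields are illustrative, and a real device would hand each payload to its IoT SDK client rather than printing it:

```python
import json
import time

def build_telemetry(count, interval_s, read_sensor):
    """Yield `count` JSON telemetry payloads, `interval_s` apart,
    mirroring the message count/frequency the portal workflow asks for."""
    for seq in range(count):
        payload = {"seq": seq, "temperature": read_sensor()}
        yield json.dumps(payload)
        if seq < count - 1:
            time.sleep(interval_s)

# Simulated sensor reading; a real device would call its own driver here.
messages = list(build_telemetry(3, 0, lambda: 21.5))
print(messages)
```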
+
+**[Required] DPS: The purpose of this test is to check that the device implements and supports the IoT Hub Device Provisioning Service with one of the three attestation methods**
+
+| **Name** | AzureCertified.DPS |
+| -- | |
+| **Target Availability** | New |
+| **Applies To** | Any device |
+| **OS** | Agnostic |
+| **Validation Type** | Automated |
+| **Validation** | Device supports easy input of the target DPS ID scope without needing to recompile the embedded code. Microsoft provides the [portal workflow](https://certify.azure.com) to execute the tests to validate that the device supports DPS **1.** User must select one of the attestation methods (X.509, TPM, or SAS key) **2.** Depending on the attestation method, the user needs to take a corresponding action, such as **a)** Upload the X.509 cert to the AICS-managed DPS scope **b)** Implement the SAS key or endorsement key into the device |
+| **Resources** | [Device provisioning service overview](../iot-dps/about-iot-dps.md) |
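For the symmetric (SAS) key attestation path with a group enrollment, the per-device key is derived from the enrollment group key by computing an HMAC-SHA256 over the registration ID. A self-contained sketch of that derivation (the key material below is a made-up example):

```python
import base64
import hashlib
import hmac

def derive_device_key(group_key_b64, registration_id):
    """Derive a per-device symmetric key from a DPS group enrollment key:
    HMAC-SHA256 over the registration ID, keyed by the decoded group key."""
    key = base64.b64decode(group_key_b64)
    sig = hmac.new(key, registration_id.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(sig).decode("utf-8")

# Illustrative group key, not a real credential.
group_key = base64.b64encode(b"example-group-master-key").decode()
device_key = derive_device_key(group_key, "device-01")
print(device_key)
```

Each registration ID yields a distinct device key, so one group key can provision a whole fleet without sharing secrets between devices.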
+
+**[If implemented] Cloud to device: The purpose of this test is to make sure messages can be sent from the cloud to devices**
+
+| **Name** | AzureCertified.C2D |
+| -- | |
+| **Target Availability** | Available now |
+| **Applies To** | Leaf device/Edge device |
+| **OS** | Agnostic |
+| **Validation Type** | Automated |
+| **Validation** | Device must be able to receive cloud-to-device messages from IoT Hub. Microsoft provides the [portal workflow](https://certify.azure.com) to execute these tests. Cloud to device (if implemented): **1.** Validates that the device can receive messages from IoT Hub **2.** AICS sends a random message and validates it via a message ACK from the device |
+| **Resources** | **a)** [Certification steps](./overview.md) (has all the additional resources) **b)** [Send cloud to device messages from an IoT Hub](../iot-hub/iot-hub-devguide-messages-c2d.md) |
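The C2D check boils down to: the service sends a message, the device receives it and acknowledges. A toy in-memory simulation of that handshake (a real device would use its IoT SDK's receive path, and the transport would complete the ACK on its behalf):

```python
from queue import Queue

class SimulatedC2DLink:
    """Toy stand-in for the IoT Hub C2D path: the 'cloud' enqueues a
    message, the device dequeues it and records an acknowledgement."""
    def __init__(self):
        self.inbox = Queue()
        self.acks = []

    def cloud_send(self, message_id, body):
        self.inbox.put((message_id, body))

    def device_receive_and_ack(self):
        message_id, body = self.inbox.get()
        # A real device would receive via its SDK client, and completing
        # the message would emit the ACK the validation looks for.
        self.acks.append(message_id)
        return body

link = SimulatedC2DLink()
link.cloud_send("m-1", "reboot-at-midnight")
body = link.device_receive_and_ack()
print(body, link.acks)
```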
+
+**[If implemented] Direct methods: The purpose of this test is to make sure the device works with IoT Hub and supports direct methods**
+
+| **Name** | AzureCertified.DirectMethods |
+| -- | |
+| **Target Availability** | Available now |
+| **Applies To** | Leaf device/Edge device |
+| **OS** | Agnostic |
+| **Validation Type** | Automated |
+| **Validation** | Device must be able to receive and reply to command requests from IoT Hub. Microsoft provides the [portal workflow](https://certify.azure.com) to execute the tests. Direct methods (if implemented) **1.** User has to specify the direct method payload. **2.** AICS validates the specified payload request is sent from the Hub and an ACK message is received by the device |
+| **Resources** | **a)** [Certification steps](./overview.md) (has all the additional resources) **b)** [Understand direct methods from IoT Hub](../iot-hub/iot-hub-devguide-direct-methods.md) |
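A direct method arrives as a method name plus a JSON payload, and the device must answer with a status code and a response payload. A minimal dispatcher sketch, independent of any SDK (the method names and payloads are illustrative):

```python
def make_dispatcher(handlers):
    """Return a function that routes a direct-method request to a handler
    and produces a (status, payload) response, as the AICS test expects."""
    def dispatch(method_name, payload):
        handler = handlers.get(method_name)
        if handler is None:
            # Unknown methods get a not-found status rather than silence.
            return 404, {"error": f"unknown method {method_name}"}
        return 200, handler(payload)
    return dispatch

# Register one illustrative method and invoke it.
dispatch = make_dispatcher({"reboot": lambda p: {"delay": p.get("delay", 0)}})
status, resp = dispatch("reboot", {"delay": 5})
print(status, resp)
```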
+
+**[If implemented] Device twin property: The purpose of this test is to make sure devices that send telemetry work with IoT Hub and support IoT Hub capabilities such as device twin properties**
+
+| **Name** | AzureCertified.DeviceTwin |
+| -- | |
+| **Target Availability** | Available now |
+| **Applies To** | Leaf device/Edge device |
+| **OS** | Agnostic |
+| **Validation Type** | Automated |
+| **Validation** | Device must send any telemetry schemas to IoT Hub. Microsoft provides the [portal workflow](https://certify.azure.com) to execute the tests. Device twin property (if implemented) **1.** AICS validates the read/write-able property in device twin JSON **2.** User has to specify the JSON payload to be changed **3.** AICS validates the specified desired properties sent from IoT Hub and ACK message received by the device |
+| **Resources** | **a)** [Certification steps](./overview.md) (has all the additional resources) **b)** [Use device twins with IoT Hub](../iot-hub/iot-hub-devguide-device-twins.md) |
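For the device twin check, the device receives a desired-properties patch, applies it, and reports the result back. A sketch of that apply-and-acknowledge step using plain dictionaries (property names are illustrative; a real device would receive the patch through its SDK's twin callback):

```python
def apply_desired_patch(reported, desired_patch):
    """Merge a desired-properties patch into the device's reported state
    and return the patch the device would report back. Keys starting
    with '$' (such as $version) are twin metadata and are not echoed."""
    ack = {}
    for key, value in desired_patch.items():
        if key.startswith("$"):
            continue
        reported[key] = value
        ack[key] = value
    return ack

reported = {"telemetryInterval": 60}
ack = apply_desired_patch(reported, {"telemetryInterval": 15, "$version": 2})
print(reported, ack)
```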
certification Program Requirements Edge Managed https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/certification/program-requirements-edge-managed.md
+
+ Title: Edge Managed Certification Requirements
+description: Edge Managed Certification program requirements
+Last updated: 03/15/2021
+# Azure Certification Edge Managed
+
+This document outlines the device specific capabilities that will be represented in the Azure Certified Device catalog. A capability is a singular device attribute that describes the device.
+
+## Program Purpose
+
+Edge Managed certification is an incremental certification beyond the baseline Azure Certified Device certification. It focuses on device management standards for Azure connected devices. (Previously, this program was identified as the IoT Edge certification program.)
+
+Edge Managed certification validates IoT Edge runtime compatibility for module deployment and management. This program provides confidence in the management of Azure connected IoT devices.
+
+## Requirements
+
+The Edge Managed certification requires that devices meet all requirements of the [Azure Certified Device baseline program](.\program-requirements-azure-certified-device.md).
+
+**DPS: The purpose of this test is to check that the device implements and supports the IoT Hub Device Provisioning Service with one of the three attestation methods**
+
+| **Name** | AzureReady.DPS |
+| -- | |
+| **Target Availability** | Ignite (in preview) |
+| **Applies To** | Any device |
+| **OS** | Agnostic |
+| **Validation Type** | Automated |
+| **Validation** | AICS validates that the device code supports DPS. **1.** User has to select one of the attestation methods (X.509, TPM, or SAS key). **2.** Depending on the attestation method, the user needs to take a corresponding action, such as **a)** Upload the X.509 cert to the AICS-managed DPS scope **b)** Implement the SAS key or endorsement key into the device. **3.** Then, the user selects the 'Connect' button to connect to the AICS-managed IoT Hub via DPS |
+| **Resources** | |
+| **Azure Recommended:** | N/A |
+
+## IoT Edge
+
+**Edge runtime exists: The purpose of this test is to make sure the IoT Edge runtime components ($edgehub and $edgeagent) on the device are functioning correctly.**
+
+| **Name** | EdgeManaged.EdgeRT |
+| -- | |
+| **Target Availability** | Available now |
+| **Applies To** | IoT Edge device |
+| **OS** | [Tier1 and Tier2 OS](../iot-edge/support.md) |
+| **Validation Type** | Automated |
+| **Validation** | AICS validates the deploy-ability of the installed IoT Edge RT. **1.** User needs to specify specific OS (OS not on the list of Tier1/2 are not accepted) **2.** AICS generates its config.yaml and deploys canonical [simulated temp sensor edge module](https://azuremarketplace.microsoft.com/en-us/marketplace/apps/azure-iot.simulated-temperature-sensor?tab=Overview) **3.** AICS validates that docker compatible container subsystem (Moby) is installed on the device **4.** Test result is determined based on successful deployment of the simulated temp sensor edge module and functionality of docker compatible container subsystem |
+| **Resources** | **a)** [AICS blog](https://azure.microsoft.com/en-in/blog/expanding-azure-iot-certification-service-to-support-azure-iot-edge-device-certification/), **b)** [Certification steps](./overview.md) (has all the additional resources), **c)** [Requirements](./program-requirements-azure-certified-device.md) |
+| **Azure Recommended:** | N/A |
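Step 2 of the validation deploys the canonical simulated temperature sensor module. The sketch below builds the relevant fragment of a deployment manifest as a Python dict; the structure is trimmed to the fields discussed here and should be treated as an assumption rather than a complete manifest:

```python
import json

# Public sample module image referenced by the validation above.
SENSOR_IMAGE = "mcr.microsoft.com/azureiotedge-simulated-temperature-sensor:1.0"

def make_deployment(image):
    """Build a minimal (assumed) deployment-manifest fragment asking the
    $edgeAgent to run one custom module from the given container image."""
    return {
        "modulesContent": {
            "$edgeAgent": {
                "properties.desired": {
                    "modules": {
                        "SimulatedTemperatureSensor": {
                            "type": "docker",
                            "status": "running",
                            "restartPolicy": "always",
                            "settings": {"image": image},
                        }
                    }
                }
            }
        }
    }

manifest = make_deployment(SENSOR_IMAGE)
print(json.dumps(manifest)[:60], "...")
```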
+
+### Capability Template:
+
+**IoT Edge easy setup: The purpose of this test is to make sure the IoT Edge device is easy to set up, and validates that the IoT Edge runtime is preinstalled during physical device validation**
+
+| **Name** | EdgeManaged.PhysicalDevice |
+| -- | |
+| **Target Availability** | Available now (currently on hold due to COVID-19) |
+| **Applies To** | IoT Edge device |
+| **OS** | [Tier1 and Tier2 OS](../iot-edge/support.md) |
+| **Validation Type** | Manual / Lab Verified |
+| **Validation** | OEM must ship the physical device to IoT administration (HCL). HCL performs manual validation on the physical device to check: **1.** EdgeRT is using the Moby subsystem (allowed redistribution version), not Docker **2.** Pick the latest edge module to validate the ability to deploy it. |
+| **Resources** | **a)** [AICS blog](https://azure.microsoft.com/en-in/blog/expanding-azure-iot-certification-service-to-support-azure-iot-edge-device-certification/), **b)** [Certification steps](./overview.md) , **c)** [Requirements](./program-requirements-azure-certified-device.md) |
+| **Azure Recommended:** | N/A |
certification Program Requirements Edge Secured Core https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/certification/program-requirements-edge-secured-core.md
+
+ Title: Edge Secured-core Certification Requirements
+description: Edge Secured-core Certification program requirements
+Last updated: 03/15/2021
+# Azure Certified Device - Edge Secured-core (Preview) #
+
+## Edge Secured-Core Certification Requirements ##
+
+This document outlines the device specific capabilities and requirements that must be met in order to complete certification and list a device in the Azure IoT Device catalog with the Edge Secured-core label.
+
+### Program Purpose ###
+Edge Secured-core is an incremental certification in the Azure Certified Device program for IoT devices running a full operating system, such as Linux or Windows 10 IoT. This program enables device partners to differentiate their devices by meeting an additional set of security criteria. Devices meeting these criteria enable the following promises:
+1. Hardware-based device identity
+2. Capable of enforcing system integrity
+3. Stays up to date and is remotely manageable
+4. Provides data at-rest protection
+5. Provides data in-transit protection
+6. Built in security agent and hardening
+### Requirements ###
++
+|Name|SecuredCore.Built-in.Security|
+|:|:|
+|Status|Required|
+|Description|The purpose of the test is to make sure devices can report security information and events by sending data to Azure Defender for IoT.|
+|Target Availability|2021|
+|Applies To|Any device|
+|OS|Agnostic|
+|Validation Type|Manual/Tools|
+|Validation |Device must generate security logs and alerts, and log and send alert messages to Azure Security Center.<ol><li>Download and deploy security agent from GitHub</li><li>Validate alert message from Azure Defender for IoT.</li></ol>|
+|Resources|[Azure Docs IoT Defender for IoT](../defender-for-iot/how-to-configure-agent-based-solution.md)|
++
+|Name|SecuredCore.Encryption.Storage|
+|:|:|
+|Status|Required|
+|Description|The purpose of the test is to validate that sensitive data can be encrypted on non-volatile storage.|
+|Target Availability|2021|
+|Applies To|Any device|
+|OS|Agnostic|
+|Validation Type|Manual/Tools|
+|Validation|Device to be validated through toolset to ensure storage encryption is enabled and default algorithm is XTS-AES, with key length 128 bits or higher.|
+|Resources||
++
+|Name|SecuredCore.Hardware.SecureEnclave|
+|:|:|
+|Status|Optional|
+|Description|The purpose of the test is to validate the existence of a secure enclave and that the enclave is accessible from a secure agent.|
+|Target Availability|2021|
+|Applies To|Any device|
+|OS|Agnostic|
+|Validation Type|Manual/Tools|
+|Validation|Device to be validated through toolset to ensure the Azure Security Agent can communicate with the secure enclave|
+|Resources|https://github.com/openenclave/openenclave/blob/master/samples/BuildSamplesLinux.md|
++
+|Name|SecuredCore.Hardware.Identity|
+|:|:|
+|Status|Required|
+|Description|The purpose of the test is to validate that the device identity is rooted in hardware.|
+|Target Availability|2021|
+|Applies To|Any device|
+|OS|Agnostic|
+|Validation Type|Manual/Tools|
+|Validation|Device to be validated through toolset to ensure that the device has a TPM present and that it can be provisioned through IoT Hub using TPM endorsement key.|
+|Resources|[Setup auto provisioning with DPS](../iot-dps/quick-setup-auto-provision.md)|
++
+|Name|SecuredCore.Update|
+|:|:|
+|Status|Required|
+|Description|The purpose of the test is to validate that the device can receive and apply updates to its firmware and software.|
+|Target Availability|2021|
+|Applies To|Any device|
+|OS|Agnostic|
+|Validation Type|Manual/Tools|
+|Validation|Partner confirmation that they were able to send an update to the device through Microsoft update, Azure Device update, or other approved services.|
+|Resources|[Device Update for IoT Hub](../iot-hub-device-update/index.yml)|
++
+|Name|SecuredCore.Manageability.Configuration|
+|:|:|
+|Status|Required|
+|Description|The purpose of the test is to validate that the device supports remote security management.|
+|Target Availability|2021|
+|Applies To|Any device|
+|OS|Agnostic|
+|Validation Type|Manual/Tools|
+|Validation|Device to be validated through toolset to ensure the device supports the ability to be remotely managed, specifically its security configurations, with status reported back to IoT Hub/Azure Defender for IoT.|
+|Resources||
++
+|Name|SecuredCore.Manageability.Reset|
+|:|:|
+|Status|Required|
+|Description|The purpose of this test is to validate the device against two use cases: a) Ability to perform a reset (remove user data, remove user configs), b) Restore device to last known good in the case of an update causing issues.|
+|Target Availability|2021|
+|Applies To|Any device|
+|OS|Agnostic|
+|Validation Type|Manual/Tools|
+|Validation|Device to be validated through a combination of toolset and submitted documentation that the device supports this functionality. The device manufacturer can determine whether to implement these capabilities to support remote reset or only local reset.|
+|Resources||
++
+|Name|SecuredCore.Updates.Duration|
+|:|:|
+|Status|Required|
+|Description|The purpose of this policy is to ensure that the device remains secure.|
+|Target Availability|2021|
+|Applies To|Any device|
+|OS|Agnostic|
+|Validation Type|Manual|
+|Validation|Commitment at submission that certified devices will be kept up to date for 60 months from the date of submission. Specifications available to the purchaser, and the device itself in some manner, should indicate the duration for which its software will be updated.|
+|Resources||
++
+|Name|SecuredCore.Policy.Vuln.Disclosure|
+|:|:|
+|Status|Required|
+|Description|The purpose of this policy is to ensure that there is a mechanism for collecting and distributing reports of vulnerabilities in the product.|
+|Target Availability|2021|
+|Applies To|Any device|
+|OS|Agnostic|
+|Validation Type|Manual|
+|Validation|Documentation on the process for submitting and receiving vulnerability reports for the certified devices will be reviewed.|
+|Resources||
++
+|Name|SecuredCore.Policy.Vuln.Fixes|
+|:|:|
+|Status|Required|
+|Description|The purpose of this policy is to ensure that vulnerabilities that are high/critical (using CVSS 3.0) are addressed within 180 days of the fix being available.|
+|Target Availability|2021|
+|Applies To|Any device|
+|OS|Agnostic|
+|Validation Type|Manual|
+|Validation|Documentation on the process for submitting and receiving vulnerability reports for the certified devices will be reviewed.|
+|Resources||
+++
+|Name|SecuredCore.Encryption.TLS|
+|:|:|
+|Status|Required|
+|Description|The purpose of the test is to validate support for required TLS versions and cipher suites.|
+|Target Availability|2021|
+|Applies To|Any device|
+|OS|Agnostic|
+|Validation Type|Manual/Tools|
+|Validation|Device to be validated through toolset to ensure the device supports a minimum TLS version of 1.2 and supports the following required TLS cipher suites.<ul><li>TLS_RSA_WITH_AES_128_GCM_SHA256</li><li>TLS_RSA_WITH_AES_128_CBC_SHA256</li><li>TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256</li><li>TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256</li><li>TLS_DHE_RSA_WITH_AES_128_GCM_SHA256</li><li>TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256</li><li>TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256</li></ul>|
+|Resources| [TLS support in IoT Hub](../iot-hub/iot-hub-tls-support.md) <br /> [TLS Cipher suites in Windows 10](https://docs.microsoft.com/windows/win32/secauthn/tls-cipher-suites-in-windows-10-v1903) |
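On platforms with a standard TLS stack, the minimum-version half of this requirement can be enforced directly in code. A Python sketch using the standard `ssl` module; the suite-comparison helper works on IANA suite names against a simulated device-reported list, since enumerating a real stack's enabled suites is platform-specific:

```python
import ssl

# Subset of the required suites above, by IANA name.
REQUIRED_SUITES = {
    "TLS_RSA_WITH_AES_128_GCM_SHA256",
    "TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256",
    "TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256",
}

def make_client_context():
    """Build a TLS client context that refuses anything below TLS 1.2,
    matching the certification's minimum-version requirement."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx

def missing_suites(device_reported, required=REQUIRED_SUITES):
    """Compare the suites a device reports against the required set;
    anything missing fails the check."""
    return sorted(required - set(device_reported))

ctx = make_client_context()
gaps = missing_suites(["TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256"])
print(ctx.minimum_version, gaps)
```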
++
+|Name|SecuredCore.Protection.SignedUpdates|
+|:|:|
+|Status|Required|
+|Description|The purpose of the test is to validate that updates must be signed.|
+|Target Availability|2021|
+|Applies To|Any device|
+|OS|Agnostic|
+|Validation Type|Manual/Tools|
+|Validation|Device to be validated through toolset to ensure that updates to the operating system, drivers, application software, libraries, packages and firmware will not be applied unless properly signed and validated.|
+|Resources||
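The gate here is that nothing installs unless its signature verifies. The sketch below uses HMAC-SHA256 so it stays self-contained and runnable; a production updater would instead verify an asymmetric signature chained to a key in the device's root of trust:

```python
import hashlib
import hmac

def verify_update(blob, signature, key):
    """Reject an update package unless its HMAC-SHA256 tag matches.
    (Stand-in for real signature verification; compare_digest avoids
    timing side channels when checking the tag.)"""
    expected = hmac.new(key, blob, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

# Illustrative key and package contents.
key = b"update-signing-key"
blob = b"firmware-v2.bin-contents"
good_sig = hmac.new(key, blob, hashlib.sha256).hexdigest()

print(verify_update(blob, good_sig, key))
print(verify_update(b"tampered", good_sig, key))
```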
++
+|Name|SecuredCore.Firmware.SecureBoot|
+|:|:|
+|Status|Required|
+|Description|The purpose of the test is to validate the boot integrity of the device.|
+|Target Availability|2021|
+|Applies To|Any device|
+|OS|Agnostic|
+|Validation Type|Manual/Tools|
+|Validation|Device to be validated through toolset to ensure that firmware and kernel signatures are validated every time the device boots. <ul><li>UEFI: Secure boot is enabled</li><li>Uboot: Verified boot is enabled</li></ul>|
+|Resources||
++
+|Name|SecuredCore.Protection.CodeIntegrity|
+|:|:|
+|Status|Required|
+|Description|The purpose of this test is to validate that code integrity is available on this device.|
+|Target Availability|2021|
+|Applies To|Any device|
+|OS|Agnostic|
+|Validation Type|Manual/Tools|
+|Validation|Device to be validated through toolset to ensure that code integrity is enabled. </br> Windows: HVCI </br> Linux: dm-verity and IMA|
+|Resources||
++
+|Name|SecuredCore.Protection.NetworkServices|
+|:|:|
+|Status|Required|
+|Description|The purpose of the test is to validate that applications accepting input from the network are not running with elevated privileges.|
+|Target Availability|2021|
+|Applies To|Any device|
+|OS|Agnostic|
+|Validation Type|Manual/Tools|
+|Validation|Device to be validated through toolset to ensure that services accepting network connections are not running with SYSTEM or root privileges.|
+|Resources||
++
+|Name|SecuredCore.Protection.Baselines|
+|:|:|
+|Status|Required|
+|Description|The purpose of the test is to validate that the system conforms to a baseline security configuration.|
+|Target Availability|2021|
+|Applies To|Any device|
+|OS|Agnostic|
+|Validation Type|Manual/Tools|
+|Validation|Device to be validated through toolset to ensure that Defender for IoT system configuration benchmarks have been run.|
+|Resources| https://techcommunity.microsoft.com/t5/microsoft-security-baselines/bg-p/Microsoft-Security-Baselines <br> https://www.cisecurity.org/cis-benchmarks/ |
++
+|Name|SecuredCore.Firmware.Protection|
+|:|:|
+|Status|Required|
+|Description|The purpose of the test is to ensure that the device has adequate mitigations against firmware security threats.|
+|Target Availability|2021|
+|Applies To|Any device|
+|OS|Agnostic|
+|Validation Type|Manual/Tools|
+|Validation|Device to be validated through toolset to confirm it is protected from firmware security threats through one of the following approaches: <ul><li>DRTM + UEFI Management Mode mitigations</li><li>DRTM + UEFI Management Mode hardening</li><li>Approved FW that does SRTM + runtime firmware hardening</li></ul> |
+|Resources| https://trustedcomputinggroup.org/ |
++
+|Name|SecuredCore.Firmware.Attestation|
+|:|:|
+|Status|Required|
+|Description|The purpose of the test is to ensure the device can remotely attest to the Microsoft Azure Attestation service.|
+|Target Availability|2021|
+|Applies To|Any device|
+|OS|Agnostic|
+|Validation Type|Manual/Tools|
+|Validation|Device to be validated through toolset to ensure that platform boot logs and measurements of boot activity can be collected and remotely attested to the Microsoft Azure Attestation service.|
+|Resources| [Microsoft Azure Attestation](../attestation/index.yml) |
++
+|Name|SecuredCore.Hardware.MemoryProtection|
+|:|:|
+|Status|Required|
+|Description|The purpose of the test is to validate that DMA is not enabled on externally accessible ports.|
+|Target Availability|2021|
+|Applies To|Any device|
+|OS|Agnostic|
+|Validation Type|Manual/Tools|
+|Validation|If DMA capable external ports exist on the device, toolset to validate that the IOMMU or SMMU is enabled and configured for those ports.|
+|Resources||
++
+|Name|SecuredCore.Protection.Debug|
+|:|:|
+|Status|Required|
+|Description|The purpose of the test is to validate that debug functionality on the device is disabled.|
+|Target Availability|2021|
+|Applies To|Any device|
+|OS|Agnostic|
+|Validation Type|Manual/Tools|
+|Validation|Device to be validated through toolset to ensure that debug functionality requires authorization to enable.|
+|Resources||
certification Program Requirements Pnp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/certification/program-requirements-pnp.md
+
+ Title: IoT Plug and Play Certification Requirements
+description: IoT Plug and Play Certification program requirements
+Last updated: 03/15/2021
+# IoT Plug and Play Certification Requirements
+
+This document outlines the device specific capabilities that will be represented in the Azure IoT Device catalog. A capability is a singular device attribute that may be a software implementation or a combination of software and hardware implementations.
+
+## Program Purpose
+
+IoT Plug and Play Preview enables solution builders to integrate smart devices with their solutions without any manual configuration. At the core of IoT Plug and Play is a device model that a device uses to advertise its capabilities to an IoT Plug and Play-enabled application. This model is structured as a set of elements: Telemetry, Properties, and Commands.
+
+The promises of IoT Plug and Play certification are:
+
+1. Defined device models and interfaces are compliant with the [Digital Twin Definition Language](https://github.com/Azure/opendigitaltwins-dtdl)
+2. Secure provisioning and easy transfer of ID scope ownership in Device Provisioning Services
+3. Easy integration with Azure IoT based solutions using the [Digital Twin APIs](../iot-pnp/concepts-digital-twin.md) : Azure IoT Hub and Azure IoT Central
+4. Validated product truth on certified devices
+
+## Requirements
+
+**[Required] Device to cloud: The purpose of this test is to make sure devices that send telemetry work with IoT Hub**
+
+| **Name** | IoTPnP.D2C |
+| -- | |
+| **Target Availability** | Available now |
+| **Applies To** | Leaf device/Edge device |
+| **OS** | Agnostic |
+| **Validation Type** | Automated |
+| **Validation** | Device must send any telemetry schemas to IoT Hub. Microsoft provides the [portal workflow](https://certify.azure.com) to execute the tests. Device to cloud (required): **1.** Validates that the device can send messages to the AICS-managed IoT Hub **2.** User must specify the number and frequency of messages. **3.** AICS validates the telemetry is received by the Hub instance |
+| **Resources** | [Certification steps](./overview.md) (has all the additional resources) |
+
+**[Required] DPS: The purpose of this test is to check that the device implements and supports the IoT Hub Device Provisioning Service with one of the three attestation methods**
+
+| **Name** | IoTPnP.DPS |
+| -- | |
+| **Target Availability** | Available now |
+| **Applies To** | Any device |
+| **OS** | Agnostic |
+| **Validation Type** | Automated |
+| **Validation** | Device must implement easy transfer of DPS ID Scope ownership without needing to recompile the embedded code. Microsoft provides the [portal workflow](https://certify.azure.com) to execute the tests to validate that the device supports DPS **1.** User must select one of the attestation methods (X.509, TPM and SAS key) **2.** Depending on the attestation method, user needs to take corresponding action such as **a)** Upload X.509 cert to AICS managed DPS scope **b)** Implement SAS key or endorsement key into the device |
+| **Resources** | **a)** [Device provisioning service overview](../iot-dps/about-iot-dps.md), **b)** [Sample config file for DPS ID Scope transfer](https://github.com/Azure/azure-iot-sdk-c/tree/public-preview-pnp/digitaltwin_client/samples/digitaltwin_sample_ll_device/sample_config) |
+
+**[Required] DTDL v2: The purpose of this test is to ensure that defined device models and interfaces are compliant with the Digital Twins Definition Language v2.**
+
+| **Name** | IoTPnP.DTDL |
+| -- | |
+| **Target Availability** | Available now |
+| **Applies To** | Any device |
+| **OS** | Agnostic |
+| **Validation Type** | Automated |
+| **Validation** | The [portal workflow](https://certify.azure.com) validates: **1.** Model ID announcement and ensure the device is connected using either the MQTT or MQTT over WebSockets protocol **2.** Models are compliant with the DTDL v2 **3.** Telemetry, properties, and commands are properly implemented and interact between IoT Hub Digital Twin and Device Twin on the device |
+| **Resources** | [Public Preview Refresh updates](../iot-pnp/overview-iot-plug-and-play-preview-updates.md) |
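Before submitting, a builder can sanity-check a model's top-level DTDL v2 shape. A minimal sketch with stdlib `json`; the thermostat model is illustrative, and a real pipeline would use a full DTDL parser rather than these spot checks:

```python
import json

# A minimal DTDL v2 interface; the dtmi and property names are illustrative.
MODEL_JSON = """
{
  "@context": "dtmi:dtdl:context;2",
  "@id": "dtmi:example:Thermostat;1",
  "@type": "Interface",
  "contents": [
    {"@type": "Telemetry", "name": "temperature", "schema": "double"}
  ]
}
"""

def basic_dtdl_v2_checks(model):
    """Cheap structural checks before handing a model to a real DTDL
    parser: v2 context, a dtmi identifier, and an Interface root type."""
    problems = []
    if model.get("@context") != "dtmi:dtdl:context;2":
        problems.append("not a DTDL v2 context")
    if not str(model.get("@id", "")).startswith("dtmi:"):
        problems.append("@id is not a DTMI")
    if model.get("@type") != "Interface":
        problems.append("root element must be an Interface")
    return problems

model = json.loads(MODEL_JSON)
print(basic_dtdl_v2_checks(model))
```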
+
+**[Required] Device models are published in public model repository**
+
+| **Name** | IoTPnP.ModelRepo |
+| -- | |
+| **Target Availability** | Available now |
+| **Applies To** | Any device |
+| **OS** | Agnostic |
+| **Validation Type** | Automated |
+| **Validation** | All device models are required to be published in the public repository. Device models are resolved via models available in the public repository **1.** User must manually publish the models to the public repository before submitting for certification. **2.** Note that once the models are published, they are immutable. We strongly recommend publishing only when the models and embedded device code are finalized. (To revoke models once published, the user must contact Microsoft support.) **3.** The [portal workflow](https://certify.azure.com) checks the existence of the models in the public repository when the device is connected to the certification service |
+| **Resources** | [Model repository](../iot-pnp/overview-iot-plug-and-play-preview-updates.md) |
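Models in the public repository are resolved from their DTMI by a simple path convention: lowercase the DTMI, replace `:` with `/` and `;` with `-`, and append `.json`. A small sketch of that mapping, useful for checking where your published model is expected to live:

```python
def dtmi_to_path(dtmi: str) -> str:
    """Convert a DTMI to its expected path in the public device model repository.
    Convention: lowercase, ':' becomes '/', ';' becomes '-', '.json' appended."""
    if not dtmi.lower().startswith("dtmi:") or ";" not in dtmi:
        raise ValueError(f"invalid DTMI: {dtmi}")
    return dtmi.lower().replace(":", "/").replace(";", "-") + ".json"

print(dtmi_to_path("dtmi:com:example:Thermostat;1"))
# → dtmi/com/example/thermostat-1.json
```

If the file is not present at that path in the repository when the device announces its model ID, the existence check fails.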
+
+**[Required] Physical device validation using the GSG**
+
+| **Name** | IoTPnP.Physicaldevice |
+| -- | |
+| **Target Availability** | Available now |
+| **Applies To** | Any device |
+| **OS** | Agnostic |
+| **Validation Type** | Manual |
+| **Validation** | Partners must engage with their Microsoft contact ([iotcert@microsoft.com](mailto:iotcert@microsoft.com)) to arrange additional validations on the physical device. Due to the COVID-19 situation, we are exploring ways to perform physical device validation without shipping the device to Microsoft. |
+| **Resources** | Details will be available later |
+| **Azure Recommended** | N/A |
+
+**[If implemented] Device info interface: The purpose of this test is to validate that the device info interface is implemented properly in the device code**
+
+| **Name** | IoTPnP.DeviceInfoInterface |
+| -- | |
+| **Target Availability** | Available now |
+| **Applies To** | Any device |
+| **OS** | Agnostic |
+| **Validation Type** | Automated |
+| **Validation** | The [portal workflow](https://certify.azure.com) validates that the device code implements the [device info interface](https://repo.azureiotrepository.com/Models/dtmi:azure:DeviceManagement:DeviceInformation;1?api-version=2020-05-01-preview) **1.** Checks that the values are emitted by the device code to IoT Hub **2.** Checks that the interface is implemented in the DCM (this implementation will change in DTDL v2) **3.** Checks that the properties are read-only (not writeable) **4.** Checks that the schema type is string and/or long, and not null |
+| **Resources** | [Microsoft defined interface](../iot-pnp/overview-iot-plug-and-play-preview-updates.md) |
+| **Azure Recommended** | N/A |
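To illustrate check 4, here is a hedged local sketch of validating a device info payload before it is reported to IoT Hub. The field list is reproduced from the Microsoft-defined device information interface; the sample values are hypothetical.

```python
# Field names from dtmi:azure:DeviceManagement:DeviceInformation;1
# (strings plus two numeric 'long' fields); values below are illustrative.
DEVICE_INFO_FIELDS = {
    "manufacturer": str, "model": str, "swVersion": str, "osName": str,
    "processorArchitecture": str, "processorManufacturer": str,
    "totalStorage": (int, float), "totalMemory": (int, float),
}

def check_device_info(reported: dict) -> list:
    """Return problems found in a reported device-info payload (empty = OK):
    every field must be present, non-null, and of the expected type."""
    problems = []
    for name, expected_type in DEVICE_INFO_FIELDS.items():
        value = reported.get(name)
        if value is None:
            problems.append(f"{name} is missing or null")
        elif not isinstance(value, expected_type):
            problems.append(f"{name} has wrong type")
    return problems

reported = {
    "manufacturer": "Contoso", "model": "T-1000", "swVersion": "1.2.0",
    "osName": "Linux", "processorArchitecture": "ARM64",
    "processorManufacturer": "Contoso Silicon",
    "totalStorage": 4096, "totalMemory": 512,
}
print(check_device_info(reported))  # → []
```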
+
+**[If implemented] Cloud to device: The purpose of this test is to make sure messages can be sent from the cloud to the device**
+
+| **Name** | IoTPnP.C2D |
+| -- | |
+| **Target Availability** | Available now |
+| **Applies To** | Leaf device/Edge device |
+| **OS** | Agnostic |
+| **Validation Type** | Automated |
+| **Validation** | The device must be able to receive cloud-to-device messages from IoT Hub. Microsoft provides the [portal workflow](https://certify.azure.com) to execute these tests. Cloud to device (if implemented): **1.** Validates that the device can receive messages from IoT Hub **2.** AICS sends a random message and validates it via the message ACK from the device |
+| **Resources** | **1.** [Certification steps](./overview.md) (has all the additional resources), **2.** [Send cloud to device messages from an IoT Hub](../iot-hub/iot-hub-devguide-messages-c2d.md) |
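To illustrate the shape of this check, here is a hypothetical local simulation. A real device receives the message from IoT Hub through the Azure IoT device SDK; the ACK logic sketched here is the same idea.

```python
import uuid

def device_on_message(message: dict) -> dict:
    """Device-side handler: acknowledge a cloud-to-device message (sketch;
    a real device would receive this from IoT Hub via the device SDK)."""
    return {"ack": message["message_id"], "status": "received"}

def run_c2d_check() -> bool:
    """Local stand-in for the AICS check: send a random message, then
    validate that the device's ACK refers to that exact message."""
    message = {"message_id": str(uuid.uuid4()), "body": "random-test-payload"}
    ack = device_on_message(message)
    return ack["ack"] == message["message_id"]

print(run_c2d_check())  # → True
```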
+
+**[If implemented] Direct methods: The purpose of this test is to make sure the device works with IoT Hub and supports direct methods**
+
+| **Name** | IoTPnP.DirectMethods |
+| -- | |
+| **Target Availability** | Available now |
+| **Applies To** | Leaf device/Edge device |
+| **OS** | Agnostic |
+| **Validation Type** | Automated |
+| **Validation** | The device must be able to receive and reply to command requests from IoT Hub. Microsoft provides the [portal workflow](https://certify.azure.com) to execute the tests. Direct methods (if implemented): **1.** You specify the payload of the direct method. **2.** AICS validates that the specified payload request is sent from IoT Hub and that an ACK message is received by the device |
+| **Resources** | **1.** [Certification steps](./overview.md) (has all the additional resources), **2.** [Understand direct methods from IoT Hub](../iot-hub/iot-hub-devguide-direct-methods.md) |
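A sketch of a device-side direct-method handler follows. The method name (`reboot`) and payload fields are hypothetical, and a real device would register a handler like this with the IoT device SDK rather than call it directly; the point is the (status code, response payload) pair the device must return.

```python
import json

def handle_direct_method(method_name: str, payload: dict):
    """Dispatch a direct-method request and return (status, response).
    Sketch only: method names and payload shape are illustrative."""
    if method_name == "reboot":
        delay = payload.get("delaySeconds", 0)
        return 200, {"scheduledInSeconds": delay}   # 200 = success
    return 404, {"error": f"unknown method {method_name}"}

status, response = handle_direct_method("reboot", {"delaySeconds": 5})
print(status, json.dumps(response))  # → 200 {"scheduledInSeconds": 5}
```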
+
+**[If implemented] Device twin property: The purpose of this test is to make sure devices that send telemetry work with IoT Hub and support IoT Hub capabilities such as direct methods and device twin properties**
+
+| **Name** | IoTPnP.DeviceTwin |
+| -- | |
+| **Target Availability** | Available now |
+| **Applies To** | Leaf device/Edge device |
+| **OS** | Agnostic |
+| **Validation Type** | Automated |
+| **Validation** | The device must send telemetry of any schema to IoT Hub. Microsoft provides the [portal workflow](https://certify.azure.com) to execute the tests. Device twin property (if implemented): **1.** AICS validates the read/writeable properties in the device twin JSON **2.** You specify the JSON payload to be changed **3.** AICS validates that the specified desired properties are sent from IoT Hub and that an ACK message is received by the device |
+| **Resources** | **1.** [Certification steps](./overview.md) (has all the additional resources), **2.** [Use device twins with IoT Hub](../iot-hub/iot-hub-devguide-device-twins.md) |
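For illustration, here is a sketch of the device-side ACK for a desired-property patch, using the IoT Plug and Play writable-property convention (`ac` = status code, `av` = acknowledged version). The property name is hypothetical.

```python
def ack_desired_properties(desired: dict) -> dict:
    """Build the reported-properties ACK for a desired-property patch.
    Sketch of the writable-property convention: echo each value with an
    'ac' status code and the '$version' of the patch as 'av'."""
    version = desired.get("$version", 1)
    return {
        name: {"value": value, "ac": 200, "av": version}
        for name, value in desired.items() if not name.startswith("$")
    }

# Simulated desired-property patch as IoT Hub would deliver it.
desired = {"targetTemperature": 21.5, "$version": 3}
print(ack_desired_properties(desired))
# → {'targetTemperature': {'value': 21.5, 'ac': 200, 'av': 3}}
```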
certification Resources Glossary https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/certification/resources-glossary.md
+
+ Title: Azure Certified Device program glossary
+description: A list of common terms used in the Azure Certified Device program
+Last updated: 03/03/2021
+# Azure Certified Device program glossary
+
+This guide provides definitions of terms commonly used in the Azure Certified Device program and portal. Refer to this glossary for clarification throughout the certification process. For your convenience, it is organized by the major certification concepts that you may have questions about.
+
+## Device class
+
+When creating your certification project, you will be asked to specify a device class. Device class refers to the form factor or classification that best represents your device.
+
+- **Gateway**
+
+ A device that processes data sent over an IoT network.
+
+- **Sensor**
+
+ A device that detects and responds to changes to an environment and connects to gateways to process the changes.
+
+- **Other**
+
+ If you select Other, add a description of your device class in your own words. Over time, we may continue to add new values to this list, particularly as we continue to monitor feedback from our partners.
+
+## Device type
+
+You will also be asked to select one of two device types during the certification process.
+
+- **Finished Product**
+
+ A device that is solution-ready and ready for production deployment. Typically in a finished form factor with firmware and an operating system. These may be general-purpose devices that require additional customization or specialized devices that require no modifications for usage.
+- **Solution-Ready Dev Kit**
+
+ A development kit containing hardware and software ideal for easy prototyping, typically not in a finished form factor. Usually includes sample code and tutorials to enable quick prototyping.
+
+## Component type
+
+In the Device details section, you'll describe your device by listing components by component type. You can view more guidance on components [here](./how-to-using-the-components-feature.md).
+
+- **Customer Ready Product**
+
+ A component representation of the overall or primary device. This is different from a **Finished Product**, which is a classification of the device as being ready for customer use without further development. A Finished Product will contain a Customer Ready Product component.
+- **Development Board**
+
+ Either an integrated or detachable board with microprocessor for easy customization.
+- **Peripheral**
+
+ Either an integrated or detachable addition to the product (such as an accessory). Peripherals typically connect to the main device but do not contribute to its primary functions; instead, they provide additional functions. Memory, RAM, storage, hard disks, and CPUs are not considered peripheral devices (they should instead be listed under Additional Specs of the Customer Ready Product component).
+- **System-On-Module**
+
+ A board-level circuit that integrates a system function in a single module.
+
+## Component attachment method
+
+Component attachment method is another component detail that informs the customer about how the component is integrated into the overall product.
+
+- **Integrated**
+
+ Refers to when a device component is a part of the main chassis of the product. This most commonly refers to a peripheral component type that cannot be removed from the device.
+ Example: An integrated temperature sensor inside a gateway chassis.
+
+- **Discrete**
+
+ Refers to when a component is **not** a part of main chassis of the product.
+ Example: An external temperature sensor that must be attached to the device.
+## Next steps
+
+This glossary will guide you through the process of certifying your project on the portal. You're now ready to begin your project!
+- [Tutorial: Creating your project](./tutorial-01-creating-your-project.md)
certification Tutorial 00 Selecting Your Certification https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/certification/tutorial-00-selecting-your-certification.md
+
+ Title: Azure Certified Device program - Tutorial - Selecting your certification program
+description: Step-by-step guide to selecting the right certification programs for your device
+Last updated: 03/19/2021
+# Tutorial: Select your certification program
+
+Congratulations on choosing the Azure Certified Device program! We're excited to have you join our ecosystem of certified devices. To begin, you must first determine which certification programs best suit your device capabilities.
+
+In this tutorial, you learn to:
+
+> [!div class="checklist"]
+> * Select the best certification program(s) for your device
+
+## Selecting a certification program for your device
+
+All devices are required to meet the baseline requirements outlined by the [**Azure Certified Device**](./program-requirements-azure-certified-device.md) certification. The other three certification badges build on this program, and each delivers different customer value. You can select one or more of the three incremental badges (IoT Plug and Play, Edge Managed, and Edge Secured-core (preview)).
+
+1. Review each of the certification programs' requirements in the table below. Detailed requirements for each program can be selected from the headers.
+
+ |Requirement|[IoT Plug and Play](./program-requirements-edge-secured-core.md)|[Edge Managed](./program-requirements-edge-managed.md)|[Edge Secured-core](./program-requirements-edge-secured-core.md)|
+    |---|---|---|---|
+ | Processor | Any|MPU/CPU|MPU/CPU|
+    | OS | Any|[Tier 1 OS](../iot-edge/support.md?view=iotedge-2018-06&preserve-view=true)|[Tier 1 OS](../iot-edge/support.md?view=iotedge-2018-06&preserve-view=true)|
+ | IoT Edge runtime | Not supported |Required|Required|
+ | Defender for IoT | Not supported|Required|Required|
+ | ADU/Windows Update | Not supported|Required|Required|
+
+1. Once you feel that you understand what is required of your device, review the technical requirements for the program. This may be the Azure Certified Device certification, or a combination of the baseline certification with one of the three incremental badges.
+
+## Next steps
+
+You're now ready to begin certifying your device! Advance to the next article to begin your project.
+> [!div class="nextstepaction"]
+>[Tutorial: Creating your project](tutorial-01-creating-your-project.md)
certification Tutorial 01 Creating Your Project https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/certification/tutorial-01-creating-your-project.md
+
+ Title: Azure Certified Device program - Tutorial - Creating your project
+description: Guide to create a project on the Azure Certified Device portal
+Last updated: 03/01/2021
+# Tutorial: Create your project
+
+Congratulations on choosing to certify your device through the Azure Certified Device program! You've now selected the appropriate certification program for your device, and are ready to get started on the portal.
+
+In this tutorial, you will learn how to:
+
+> [!div class="checklist"]
+> * Sign into the [Azure Certified Device portal](https://certify.azure.com/)
+> * Create a new certification project for your device
+> * Specify basic device details of your project
+
+## Prerequisites
+
+- You'll need a valid work/school [Azure Active Directory account](https://docs.microsoft.com/azure/active-directory/fundamentals/active-directory-whatis).
+- You'll need a verified Microsoft Partner Network (MPN) account. If you don't have an MPN account, [join the partner network](https://partner.microsoft.com/) before you begin.
+
+## Signing into the Azure Certified Device portal
+
+To get started, you must sign in to the portal, where you'll be providing your device information, completing certification testing, and managing your device publications to the Azure Certified Device catalog.
+
+1. Go to the [Azure Certified Device portal](https://certify.azure.com).
+1. Select `Company profile` on the left-hand side and update your manufacturer information.
+ ![Company profile section](./media/images/company-profile.png)
+1. Accept the program agreement to begin your project.
+
+## Creating your project on the portal
+
+Now that you're all set up in the portal, you can begin the certification process. First, you must create a project for your device.
+
+1. On the home screen, select `Create new project`. This will open a window to add basic device information in the next section.
+
+ ![Image of the Create new project button](./media/images/create-new-project.png)
+
+## Identifying basic device information
+
+Then, you must supply basic device information. You can edit this information later.
+
+1. Complete the fields requested under the `Basics` section. Refer to the table below for clarification regarding the **required** fields:
+
+ | Fields | Description |
+ ||-|
+ | Project name | Internal name that will not be visible on the Azure Certified Device catalog |
+ | Device name | Public name for your device |
+ | Device type | Specification of Finished Product or Solution-Ready Developer Kit. For more information about the terminology, see [Certification glossary](./resources-glossary.md). |
+ | Device class | Gateway, Sensor, or other. For more information about the terminology, see [Certification glossary](./resources-glossary.md). |
+ | Device source code URL | Required if you are certifying a Solution-Ready Dev Kit, optional otherwise. URL must be to a GitHub location for your device code. |
+1. Select the `Next` button to continue to the `Certifications` tab.
+
+ ![Image of the Create new project form, Certifications tab](./media/images/create-new-project-certificationswindow.png)
+
+1. Specify which certification(s) you wish to achieve for your device.
+1. Select `Create` and the new project will be saved and visible in the home page of the portal.
+
+ ![Image of project table](./media/images/project-table.png)
+
+1. Select the project name in the table. This will launch the project summary page, where you can add and view other details about your device.
+
+ ![Image of the project details page](./media/images/device-details-section.png)
+
+## Next steps
+
+You are now ready to add device details and test your device using our certification service. Advance to the next article to learn how to edit your device details.
+> [!div class="nextstepaction"]
+> [Tutorial: Adding device details](tutorial-02-adding-device-details.md)
certification Tutorial 02 Adding Device Details https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/certification/tutorial-02-adding-device-details.md
+
+ Title: Azure Certified Device program - Tutorial - Adding device details
+description: A step-by-step guide to add device details to your project on the Azure Certified Device portal
+Last updated: 03/02/2021
+# Tutorial: Add device details
+
+Now that you've created a project for your device, you're all set to begin the certification process! First, let's add your device details. These include the technical specifications that your customers will be able to view on the Azure Certified Device catalog, and the marketing details they will use to purchase your device once they've made a decision.
+
+In this tutorial, you learn how to:
+
+> [!div class="checklist"]
+> * Add device details using the Components and Dependencies features
+> * Upload a Get Started guide for your device
+> * Specify marketing details for customers to purchase your device
+> * Optionally identify any industry certifications
+
+## Prerequisites
+
+* You should be signed in and have a project for your device created on the [Azure Certified Device portal](https://certify.azure.com). For more information, view the [tutorial](tutorial-01-creating-your-project.md).
+* You should have a Get Started guide for your device in PDF format. We provide a number of Get Started templates for you to use, depending on both the certification program and your preferred language. The templates are available at our [Get started templates](https://aka.ms/GSTemplate "Get started templates") GitHub location.
+
+## Adding technical device details
+
+The first section of your project page, called 'Input device details', allows you to provide information on the core hardware capabilities of your device, such as device name, description, processor, operating system, connectivity options, hardware interfaces, industry protocols, physical dimensions, and more. While many of the fields are optional, most of this information will be made available to potential customers on the Azure Certified Device catalog if you choose to publish your device after it has been certified.
+
+1. Click `Add` in the 'Input device details' section on your project summary page to open the device details section. You will see five sections for you to complete.
+
+![Image of the project details page](./media/images/device-details-menu.png)
+
+2. Review the information you previously provided when you created the project under the `Basics` tab.
+1. Review the certifications you are applying for with your device under the `Certifications` tab.
+1. Open the `Product details` tab and select at least one operating system.
+1. Add **at least** one discrete component that describes your device. You can view additional guidance on component usage [here](how-to-using-the-components-feature.md).
+1. Click `Save`. You will then be able to edit your component device and add more advanced details.
+1. List additional device details not captured by the component details under `Additional product details`.
+1. If you marked `Other` in any of the component fields or have a special circumstance you would like to flag with the Azure Certification team, leave a clarifying comment in the `Comments for reviewer` section.
+1. Use the `Dependencies` tab to list any dependencies if your device requires additional hardware or services to send data to Azure. You can view additional guidance on listing dependencies [here](how-to-indirectly-connected-devices.md).
+1. Once you are satisfied with the information you've provided, you can use the `Review` tab for a read-only overview of the full set of device details that have been entered.
+1. Click `Project summary` at the top of the page to return to your summary page.
+
+![Review project details page](./media/images/sample-device-details.png)
+
+## Uploading a Get Started guide
+
+The Get Started guide is a PDF document that simplifies the setup, configuration, and management of your product. Its purpose is to make it simple for customers to connect your device to Azure and support it. As part of the certification process, we require our partners to provide **one** Get Started guide for their most relevant certification program.
+
+1. Double-check that you have provided all requested information in your Get Started guide PDF according to the supplied [templates](https://aka.ms/GSTemplate). The template that you use should be determined by the certification badge you are applying for. (For example, an IoT Plug and Play device will use the IoT Plug and Play template. Devices applying for *only* the Azure Certified Device baseline certification will use the Azure Certified Device template.)
+1. Click `Add` in the 'Get Started' guide section of the project summary page.
+
+![Image of GSG button](./media/images/gsg-menu.png)
+
+2. Click 'Choose File' to upload your PDF.
+1. Review the document in the preview for formatting.
+1. Save your upload by clicking the 'Save' button.
+1. Click `Project summary` at the top of the page to return to your summary page.
+
+## Providing marketing details
+
+In this area, you will provide customer-ready marketing information for your device. These fields will be showcased on the Azure Certified Device catalog if you choose to publish your certified device.
+
+1. Click `Add` in the 'Add marketing details' section to open the marketing details page.
+
+![Image of marketing details section](./media/images/marketing-details.png)
+
+1. Upload a product photo in JPEG or PNG format that will be used in the catalog.
+1. Write a short description of your device that will be displayed on the product description page of the catalog.
+1. Indicate geographic availability of your device.
+1. Provide a link to the manufacturer's marketing page for this device. This should be a link to a site that provides additional information about the device.
+ > [!Note]
+    > Please ensure all supplied URLs are valid or will be active at the time of publication following approval.
+
+1. Indicate up to 3 target industries that your device is optimized for.
+1. Provide information for up to 5 distributors of your device. This may include the manufacturer's own site.
+
+ > [!Note]
+    > If no distributor product page URL is supplied, the `Shop` button on the catalog will default to the link supplied for `Distributor page`, which may not be specific to the device. Ideally, the distributor URL should lead to a specific page where a customer can purchase the device, but this is not mandatory. If the distributor is the same as the manufacturer, this URL may be the same as the manufacturer's marketing page.
+
+1. Click `Save` to confirm your information.
+1. Click `Project summary` at the top of the page to return to your summary page.
+
+## Declaring additional industry certifications
+
+You can also promote additional industry certifications you may have received for your device. These certifications can help provide further clarity on the intended use of your device and will be searchable on the Azure Certified Device catalog.
+
+1. Click `Add` in the 'Provide industry certifications' section.
+1. Click `Add a certification` to select from a list of common industry certification programs. If your product has achieved a certification not in our list, you can specify a custom string value by selecting `Other (please specify)`.
+1. Optionally provide a description or notes to the reviewer. However, these notes will not be publicly available to view on the catalog.
+1. Click `Save` to confirm your information.
+1. Click `Project summary` at the top of the page to return to your summary page.
+
+## Next steps
+
+Now you have completed the process of describing your device! This will help the Azure Certified Device review team and your customer better understand your product. Once you are satisfied with the information you've provided, you are now ready to move on to the testing phase of the certification process.
+> [!div class="nextstepaction"]
+> [Tutorial: Testing your device](tutorial-03-testing-your-device.md)
certification Tutorial 03 Testing Your Device https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/certification/tutorial-03-testing-your-device.md
+
+ Title: Azure Certified Device program - Tutorial - Testing your device
+description: A step-by-step guide to test your device with the AICS service on the Azure Certified Device portal
+Last updated: 03/02/2021
+# Tutorial: Test and submit your device
+
+The next major phase of the certification process (though it can be completed before adding your device details) involves testing your device. Through the portal, you'll use the Azure IoT Certification Service (AICS) to demonstrate your device performance according to our certification requirements. Once you've successfully passed the testing phase, you'll then submit your device for final review and approval by the Azure Certification team!
+
+In this tutorial, you learn how to:
+
+> [!div class="checklist"]
+> * Connect your device to IoT Hub using Device Provisioning Service (DPS)
+> * Test your device according to your selected certification program(s)
+> * Submit your device for review by the Azure Certification team
+
+## Prerequisites
+
+- You should be signed in and have a project for your device created on the [Azure Certified Device portal](https://certify.azure.com). For more information, view the [tutorial](tutorial-01-creating-your-project.md).
+- (Optional) We advise that you prepare your device and manually verify its performance according to the certification requirements. If you wish to re-test with different device code or a different certification program, you will have to create a new project.
+
+## Connecting your device using DPS
+
+All certified devices are required to demonstrate the ability to connect to IoT Hub using DPS. The following steps walk you through how to successfully connect your device for testing on the portal.
+
+1. To begin the testing phase, select the `Connect & test` link on the project summary page:
+
+ ![Connect and test link](./media/images/connect-and-test-link.png)
+
+1. Depending on the certification(s) selected, you'll see the required tests on the 'Connect & test' page. Review these to ensure that you're applying for the correct certification program.
+
+ ![Connect and test page](./media/images/connect-and-test.png)
+
+1. Connect your device to IoT Hub using the Device Provisioning Service (DPS). DPS supports the connectivity options of symmetric keys, X.509 certificates, and a Trusted Platform Module (TPM). This is required for all certifications.
+
+ - *For more information on connecting your device to Azure IoT Hub with DPS, visit [Provisioning devices overview](../iot-dps/about-iot-dps.md "Device Provisioning Service overview").*
+
+1. If you are using symmetric keys, you'll be asked to configure DPS with the supplied DPS ID scope, device ID, authentication key, and DPS endpoint. Otherwise, you will be asked to provide either an X.509 certificate or an endorsement key.
+
+1. After configuring your device with DPS, confirm the connection by clicking the `Connect` button at the bottom of the page. Upon successful connection, you can proceed to the testing phase by clicking the `Next` button.
+
+ ![Connect and Test connected](./media/images/connected.png)
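If you provision through a DPS group enrollment with symmetric keys, the per-device key is typically derived from the group key by computing an HMAC-SHA256 of the registration ID. A minimal sketch of that derivation (the group key below is a placeholder, not a real enrollment key):

```python
import base64
import hashlib
import hmac

def derive_device_key(group_key_b64: str, registration_id: str) -> str:
    """Derive a per-device symmetric key from a DPS group enrollment key:
    HMAC-SHA256 of the registration ID, keyed with the decoded group key."""
    key = base64.b64decode(group_key_b64)
    digest = hmac.new(key, registration_id.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode("ascii")

# Hypothetical group key and registration ID, for illustration only.
group_key = base64.b64encode(b"not-a-real-group-enrollment-key").decode("ascii")
print(derive_device_key(group_key, "contoso-device-001"))
```

The derived key, together with the ID scope and registration (device) ID, is what the device presents to the DPS endpoint during provisioning.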
+
+## Testing your device
+
+Once you have successfully connected your device to AICS, you are now ready to run the certification tests specific to the certification program you are applying for.
+
+1. **For Azure Certified Device certification**: In the 'Select device capability' tab, you will review and select which tests you wish to run on your device.
+1. **For IoT Plug and Play certification**: Carefully review the parameters that will be checked during the test that you declared in your device model.
+1. **For Edge Managed certification**: No additional steps are required beyond demonstrating connectivity.
+1. Once you have completed the necessary preparations for the specified certification program, select `Next` to proceed to the 'Test' phase.
+1. Select `Run tests` on the page to begin running AICS with your device.
+1. Once you have received a notification that you have passed the tests, select `Finish` to return to your summary page.
+
+![Test passed](./media/images/test-pass.png)
+
+7. If you have additional questions or need troubleshooting assistance with AICS, visit our troubleshooting guide.
+
+> [!NOTE]
+> While you will be able to complete the online certification process for IoT Plug and Play and Edge Managed without having to submit your device for manual review, you may be contacted by an Azure Certified Device team member for further device validation beyond what is tested through our automation service.
+
+## Submitting your device for review
+
+Once you have completed all of the mandatory fields in the 'Device details' section and successfully passed the automated testing in the 'Connect & test' process, you can now notify the Azure Certified Device team that you are ready for certification review.
+
+1. Select `Submit for review` on the project summary page:
+
+ ![Review and Certify link](./media/images/review-and-certify.png)
+
+1. Confirm your submission in the pop-up window. Once a device has been submitted, all device details will be read-only until editing is requested. (See [How to edit your device information after publishing](./how-to-edit-published-device.md).)
+
+ ![Start Certification review dialog](./media/images/start-certification-review.png)
+
+1. Once the project is submitted, the project summary page will indicate the project is `Under Certification Review` by the Azure Certification team:
+
+ ![Under Review](./media/images/review-and-certify-under-review.png)
+
+1. Within 5-7 business days, expect an email response from the Azure Certification team to the address provided in your company profile regarding the status of your device submission.
+
+ - Approved submission
+ Once your project has been reviewed and approved, you will receive an email. The email will include a set of files including the Azure Certified Device badge, badge usage guidelines, and other information on how to amplify the message that your device is certified. Congratulations!
+
+ - Pending submission
+ In the case your project is not approved, you will be able to make changes to the project details and then resubmit the device for certification once ready. An email will be sent with information on why the project was not approved and steps to resubmit for certification.
+
+## Next steps
+
+Congratulations! Your device has now successfully passed all of the tests and has been approved through the Azure Certified Device program. You can now publish your device to our Azure Certified Device catalog, where customers can shop for your products with confidence in their performance with Azure.
+> [!div class="nextstepaction"]
+> [Tutorial: Publishing your device](tutorial-04-publishing-your-device.md)
+
certification Tutorial 04 Publishing Your Device https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/certification/tutorial-04-publishing-your-device.md
+
+ Title: Azure Certified Device program - Tutorial - Publishing your device
+description: A step-by-step guide to publish your certified device to the Azure Certified Device catalog
+Last updated: 03/03/2021
+# Tutorial: Publish your device
+
+Congratulations on successfully certifying your device! Your product is joining an ecosystem of exceptional devices that work great with Azure. Now that your device has been certified, you can optionally publish your device details to the [Azure Certified Device catalog](https://devicecatalog.azure.com) for a world of customers to discover and buy.
+
+In this tutorial, you learn how to:
+
+> [!div class="checklist"]
+> * Publish your device to the Azure Certified Device catalog
+
+## Prerequisites
+
+- You should be signed in and have an **approved** project for your device on the [Azure Certified Device portal](https://certify.azure.com). If you don't have a certified device, you can view this [tutorial](tutorial-01-creating-your-project.md) to get started.
+
+## Publishing your device
+
+Publishing your device is a simple process that will help bring customers to your product from the Azure Certified Device catalog.
+
+1. To publish your device, click `Publish to Device Catalog` on the project summary page.
+
+ ![Publish to Catalog](./media/images/publish-to-catalog.png)
+
+1. Confirm the publication in the pop-up window.
+
+ ![Publish to Catalog confirmation](./media/images/publish-to-catalog-confirm.png)
+
+1. You will receive a notification to the email address in your company profile once the device has been processed and published to the Azure Certified Device catalog.
+
+## Next steps
+
+Congratulations! Your certified device is now a part of the Azure Certified Device catalog, where customers can shop for your products with confidence in their performance with Azure! Thank you for being part of our ecosystem of certified IoT products. You will notice that your project page is now read-only. If you wish to make any updates to your device information, see our how-to guide.
+> [!div class="nextstepaction"]
+> [How to edit your published device](how-to-edit-published-device.md)
+
cognitive-services Ecommerce Retail Catalog Moderation https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Content-Moderator/ecommerce-retail-catalog-moderation.md
Last updated 01/29/2021
-#As a developer at an e-commerce company, I want to use machine learning to both categorize product images and tag objectionable images for further review by my team.
+#Customer intent: As a developer at an e-commerce company, I want to use machine learning to both categorize product images and tag objectionable images for further review by my team.
# Tutorial: Moderate e-commerce product images with Azure Content Moderator
cognitive-services Image Lists Quickstart Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Content-Moderator/image-lists-quickstart-dotnet.md
Last updated 10/24/2019
-#As a C# developer of content-providing software, I want to check images against a custom list of inappropriate images so that I can handle them more efficiently.
+#Customer intent: As a C# developer of content-providing software, I want to check images against a custom list of inappropriate images so that I can handle them more efficiently.
# Moderate with custom image lists in C#
cognitive-services Term Lists Quickstart Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Content-Moderator/term-lists-quickstart-dotnet.md
Last updated 10/24/2019
-#As a C# developer of content-providing software, I want to analyze text content for terms that are particular to my product, so that I can categorize and handle it accordingly.
+#Customer intent: As a C# developer of content-providing software, I want to analyze text content for terms that are particular to my product, so that I can categorize and handle it accordingly.
# Check text against a custom term list in C#
cognitive-services Video Moderation Api https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Content-Moderator/video-moderation-api.md
Last updated 05/18/2020
-#As a C# developer of content management software, I want to analyze video content for offensive or inappropriate material so that I can categorize and handle it accordingly.
+#Customer intent: As a C# developer of content management software, I want to analyze video content for offensive or inappropriate material so that I can categorize and handle it accordingly.
# Analyze video content for objectionable material in C#
cognitive-services Custom Vision Onnx Windows Ml https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Custom-Vision-Service/custom-vision-onnx-windows-ml.md
Last updated 04/29/2020
-# As a developer, I want to use a custom vision model with Windows ML.
+#Customer intent: As a developer, I want to use a custom vision model with Windows ML.
# Use an ONNX model from Custom Vision with Windows ML (preview)
cognitive-services Speech Synthesis Markup https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/speech-synthesis-markup.md
Currently, speaking style adjustments are supported for these neural voices:
* `zh-CN-XiaoxuanNeural` (Preview) * `zh-CN-XiaoruiNeural` (Preview)
-The intensity of speaking style can be further changed to better fit your use case. You can specify a stronger or softer style with `styledegree` to make the speech more expressive or subdued.
+The intensity of speaking style can be further changed to better fit your use case. You can specify a stronger or softer style with `styledegree` to make the speech more expressive or subdued. Currently, speaking style adjustments are supported for Chinese (Mandarin, Simplified) neural voices.
-Currently, speaking style adjustments are supported for these neural voices:
-* `zh-CN-XiaoxiaoNeural`
-
-Apart from adjusting the speaking styles and style degree, you can also adjust the `role` parameter so that the voice will imitate a different age and gender. For example, a male voice can raise the pitch and change the intonation to imitate a female voice.
-
-Currently, role-play adjustments are supported for these neural voices:
+Apart from adjusting the speaking styles and style degree, you can also adjust the `role` parameter so that the voice will imitate a different age and gender. For example, a male voice can raise the pitch and change the intonation to imitate a female voice, but the voice name will not be changed. Currently, role-play adjustments are supported for these Chinese (Mandarin, Simplified) neural voices:
* `zh-CN-XiaomoNeural` * `zh-CN-XiaoxuanNeural`
Above changes are applied at the sentence level, and styles and role-plays vary
<mstts:express-as role="string" style="string"></mstts:express-as> ``` > [!NOTE]
-> At the moment, `styledegree` only supports zh-CN-XiaoxiaoNeural. `role` only supports zh-CN-XiaomoNeural and zh-CN-XiaoxuanNeural.
+> At the moment, `styledegree` only supports Chinese (Mandarin, Simplified) neural voices. `role` only supports zh-CN-XiaomoNeural and zh-CN-XiaoxuanNeural.
**Attributes**

| Attribute | Description | Required / Optional |
|--|--|--|
| `style` | Specifies the speaking style. Currently, speaking styles are voice-specific. | Required if adjusting the speaking style for a neural voice. If using `mstts:express-as`, then style must be provided. If an invalid value is provided, this element will be ignored. |
-| `styledegree` | Specifies the intensity of speaking style. **Accepted values**: 0.01 to 2 inclusive. The default value is 1 which means the predefined style intensity. The minimum unit is 0.01 which results in a slightly tendency for the target style. A value of 2 results in a doubling of the default style intensity. | Optional (At the moment, `styledegree` only supports zh-CN-XiaoxiaoNeural.)|
-| `role` | Specifies the speaking role-play. The voice will act as a different age and gender. | Optional (At the moment, `role` only supports zh-CN-XiaomoNeural and zh-CN-XiaoxuanNeural.)|
+| `styledegree` | Specifies the intensity of the speaking style. **Accepted values**: 0.01 to 2 inclusive. The default value is 1, which is the predefined style intensity. The minimum unit is 0.01, which results in a slight tendency toward the target style. A value of 2 doubles the default style intensity. | Optional (At the moment, `styledegree` only supports Chinese (Mandarin, Simplified) neural voices.)|
+| `role` | Specifies the speaking role-play. The voice will act as a different age and gender, but the voice name will not be changed. | Optional (At the moment, `role` only supports zh-CN-XiaomoNeural and zh-CN-XiaoxuanNeural.)|
Use this table to determine which speaking styles are supported for each neural voice.
Use this table to determine which speaking styles are supported for each neural
| | `style="angry"` | Expresses an angry and annoyed tone, with lower pitch, higher intensity, and higher vocal energy. The speaker is in a state of being irate, displeased, and offended. | | | `style="fearful"` | Expresses a scared and nervous tone, with higher pitch, higher vocal energy, and faster rate. The speaker is in a state of tenseness and uneasiness. |
-Use this table to determine which roles are supported for each neural voice.
-
-| Voice | Role | Description |
-|-|-|-|
-| `zh-CN-XiaomoNeural` | `role="YoungAdultFemale"` | The voice imitates to a young adult female. |
-| | `role="OlderAdultMale"` | The voice imitates to an older adult male. |
-| | `role="Girl"` | The voice imitates to a girl. |
-| | `role="Boy"` | The voice imitates to a boy. |
-| `zh-CN-XiaoxuanNeural` | `role="YoungAdultFemale"` | The voice imitates to a young adult female. |
-| | `role="OlderAdultFemale"` | The voice imitates to an older adult female. |
-| | `role="OlderAdultMale"` | The voice imitates to an older adult male. |
+Use this table to check the supported roles and their definitions.
+
+|Role | Description |
+|-|-|
+|`role="Girl"` | The voice imitates a girl. |
+|`role="Boy"` | The voice imitates a boy. |
+|`role="YoungAdultFemale"`| The voice imitates a young adult female.|
+|`role="YoungAdultMale"` | The voice imitates a young adult male.|
+|`role="OlderAdultFemale"`| The voice imitates an older adult female.|
+|`role="OlderAdultMale"` | The voice imitates an older adult male.|
+|`role="SeniorFemale"` | The voice imitates a senior female.|
+|`role="SeniorMale"` | The voice imitates a senior male.|
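The `style`, `styledegree`, and `role` attributes above combine in a single `mstts:express-as` element. The following sketch builds such a fragment and checks that it is well-formed XML; the helper name and the `style="calm"` value are illustrative assumptions, not part of the Speech SDK.

```python
import xml.etree.ElementTree as ET

# Illustrative helper (not part of the Speech SDK): assemble an
# <mstts:express-as> fragment with style, styledegree, and role, then
# validate that the resulting SSML is well-formed.
def build_express_as(voice, style, role=None, styledegree=None, text=""):
    attrs = [f'style="{style}"']
    if styledegree is not None:
        # styledegree accepts 0.01 to 2 inclusive; 1 is the predefined intensity
        if not (0.01 <= styledegree <= 2):
            raise ValueError("styledegree must be between 0.01 and 2")
        attrs.append(f'styledegree="{styledegree}"')
    if role is not None:
        attrs.append(f'role="{role}"')
    return (
        '<speak version="1.0" xmlns="http://www.w3.org/2001/10/synthesis" '
        'xmlns:mstts="https://www.w3.org/2001/mstts" xml:lang="zh-CN">'
        f'<voice name="{voice}">'
        f'<mstts:express-as {" ".join(attrs)}>{text}</mstts:express-as>'
        '</voice></speak>'
    )

ssml = build_express_as(
    "zh-CN-XiaomoNeural", style="calm", role="YoungAdultFemale",
    styledegree=1.5, text="你好")
ET.fromstring(ssml)  # raises if the fragment is not well-formed
```

Out-of-range `styledegree` values are rejected before any request is sent, mirroring the accepted range documented in the attribute table.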
+ **Example**
cognitive-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Translator/document-translation/overview.md
The following document file types are supported by Document Translation:
|Microsoft Word|.docx| A text document file.| |Tab Separated Values/TAB|.tsv/.tab| A tab-delimited raw-data file used by spreadsheet programs.| |Text|.txt| An unformatted text document.|
-|Translation Memory Exchange|.tmx|An open XML standard used for exchanging translation memory (TM) data created by Computer Aided Translation (CAT) and localization applications.|
## Supported glossary formats
cognitive-services V3 0 Reference https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Translator/reference/v3-0-reference.md
Microsoft Translator is served out of multiple datacenter locations. Currently t
* **Americas:** East US, South Central US, West Central US, and West US 2 * **Asia Pacific:** Korea South, Japan East, Southeast Asia, and Australia East
-* **Europe:** North Europe, West Europe, Switzerland North<sup>1,2</sup>, and Switzerland West<sup>1,2</sup>
+* **Europe:** North Europe, West Europe
Requests to the Microsoft Translator are in most cases handled by the datacenter that is closest to where the request originated. In case of a datacenter failure, the request may be routed outside of the Azure geography.
To force the request to be handled by a specific Azure geography, change the Glo
|Azure|Europe| api-eur.cognitive.microsofttranslator.com| |Azure|Asia Pacific| api-apc.cognitive.microsofttranslator.com|
-<sup>1</sup> Customer with a resource located in Switzerland North or Switzerland West can ensure that their Text API requests are served within Switzerland. To ensure that requests are handled in Switzerland, create the Translator resource in the ΓÇÿResource regionΓÇÖ ΓÇÿSwitzerland NorthΓÇÖ or ΓÇÿSwitzerland WestΓÇÖ, then use the resourceΓÇÖs custom endpoint in your API requests. For example: If you create a Translator resource in Azure portal with ΓÇÿResource regionΓÇÖ as ΓÇÿSwitzerland NorthΓÇÖ and your resource name is ΓÇÿmy-ch-nΓÇÖ then your custom endpoint is ΓÇ£https://my-ch-n.cognitiveservices.azure.comΓÇ¥. And a sample request to translate is:
+<sup>1</sup> Customers with a resource located in Switzerland North or Switzerland West can ensure that their Text API requests are served within Switzerland. To ensure that requests are handled in Switzerland, create the Translator resource in the 'Resource region' 'Switzerland North' or 'Switzerland West', then use the resource's custom endpoint in your API requests. For example: If you create a Translator resource in Azure portal with 'Resource region' as 'Switzerland North' and your resource name is 'my-ch-n', then your custom endpoint is "https://my-ch-n.cognitiveservices.azure.com". And a sample request to translate is:
```curl
// Pass secret key and region using headers to a custom endpoint
curl -X POST "https://my-ch-n.cognitiveservices.azure.com/translator/text/v3.0/translate?to=fr" \
    -H "Content-Type: application/json" \
    -d "[{'Text':'Hello'}]" -v
```
-<sup>2</sup>Custom Translator is not currently available in Switzerland.
+<sup>2</sup> Custom Translator is not currently available in Switzerland.
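The geography endpoints in the table above can be selected programmatically. This is a minimal sketch using only the two endpoints visible in the table; the mapping dictionary and helper name are illustrative, not part of the Translator API.

```python
# Sketch of the geography-to-endpoint mapping from the table above.
# The dictionary and function names are illustrative assumptions.
GEOGRAPHY_ENDPOINTS = {
    "Europe": "api-eur.cognitive.microsofttranslator.com",
    "Asia Pacific": "api-apc.cognitive.microsofttranslator.com",
}

def translate_url(geography, to_lang, api_version="3.0"):
    """Build a v3.0 translate request URL for a specific Azure geography."""
    host = GEOGRAPHY_ENDPOINTS[geography]
    return f"https://{host}/translate?api-version={api_version}&to={to_lang}"

print(translate_url("Europe", "fr"))
```

Routing a request to a specific geography is then just a matter of substituting the host before sending the request.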
## Authentication
cognitive-services Cognitive Services Container Support https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/cognitive-services-container-support.md
Last updated 12/16/2020 keywords: on-premises, Docker, container, Kubernetes
-#As a potential customer, I want to know more about how Cognitive Services provides and supports Docker containers for each service.
+#Customer intent: As a potential customer, I want to know more about how Cognitive Services provides and supports Docker containers for each service.
# Azure Cognitive Services containers
cognitive-services Azure Container Instance Recipe https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/containers/azure-container-instance-recipe.md
Last updated 12/18/2020
-#As a potential customer, I want to know more about how Cognitive Services provides and supports Docker containers for each service.
+#Customer intent: As a potential customer, I want to know more about how Cognitive Services provides and supports Docker containers for each service.
# https://github.com/Azure/cognitiveservices-aci
cognitive-services Container Reuse Recipe https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/containers/container-reuse-recipe.md
Last updated 04/01/2020
-#As a potential customer, I want to know how to configure containers so I can reuse them.
+#Customer intent: As a potential customer, I want to know how to configure containers so I can reuse them.
# SME: Siddhartha Prasad <siprasa@microsoft.com>
cognitive-services Docker Compose Recipe https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/containers/docker-compose-recipe.md
Last updated 10/29/2020
-#As a potential customer, I want to know how to configure containers so I can reuse them.
+#Customer intent: As a potential customer, I want to know how to configure containers so I can reuse them.
# SME: Brendan Walsh
cognitive-services Concept Custom https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/form-recognizer/concept-custom.md
With Form Recognizer, you can train a model that will extract information from f
At a high level, the steps for building, training, and using your custom model are as follows: > [!div class="nextstepaction"]
->Assemble your training dataset](build-training-data-set.md#custom-model-input-requirements)
+>[&#120783;. Assemble your training dataset](build-training-data-set.md#custom-model-input-requirements)
Building a custom model begins with establishing your training dataset. You'll need a minimum of five completed forms of the same type for your sample dataset. They can be of different file types and contain both text and handwriting. Your forms must be of the same type of document and follow the [input requirements](build-training-data-set.md#custom-model-input-requirements) for Form Recognizer.
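The "at least five completed forms" rule above can be checked before any data is uploaded. The following is a hedged pre-flight sketch, not part of the Form Recognizer SDK; the accepted extensions listed are an assumption for illustration.

```python
# Illustrative pre-flight check (not part of the Form Recognizer SDK):
# enforce the minimum of five completed forms before uploading a
# training dataset. The extension set here is an assumption.
SUPPORTED_EXTENSIONS = {".pdf", ".jpg", ".jpeg", ".png", ".tiff"}
MIN_TRAINING_FORMS = 5

def validate_training_set(filenames):
    forms = [f for f in filenames
             if any(f.lower().endswith(ext) for ext in SUPPORTED_EXTENSIONS)]
    if len(forms) < MIN_TRAINING_FORMS:
        raise ValueError(
            f"need at least {MIN_TRAINING_FORMS} forms, got {len(forms)}")
    return forms

validate_training_set(
    ["form1.pdf", "form2.pdf", "form3.pdf", "form4.pdf", "form5.pdf"])
```

Running such a check locally catches an undersized dataset before the blob upload and training steps.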
-&emsp;&emsp;&emsp;&emsp;&emsp;&emsp;&#129155;
> [!div class="nextstepaction"]
-> [Upload your training dataset](build-training-data-set.md#upload-your-training-data)
+> [&#120784;. Upload your training dataset](build-training-data-set.md#upload-your-training-data)
You'll need to upload your training data to an Azure blob storage container. If you don't know how to create an Azure storage account with a container, *see* [Azure Storage quickstart for Azure portal](../../storage/blobs/storage-quickstart-blobs-portal.md). Use the free pricing tier (F0) to try the service, and upgrade later to a paid tier for production.
-&emsp;&emsp;&emsp;&emsp;&emsp;&emsp;&#129155;
+ > [!div class="nextstepaction"]
->[Train your custom model](quickstarts/client-library.md#train-a-custom-model)
+>[&#120785;. Train your custom model](quickstarts/client-library.md#train-a-custom-model)
You can train your model [without](quickstarts/client-library.md#train-a-model-without-labels) or [with](quickstarts/client-library.md#train-a-model-with-labels) labeled data sets. Unlabeled datasets rely solely on the Layout API to detect and identify key information without added human input. Labeled datasets also rely on the Layout API, but supplementary human input is included such as your specific labels and field locations. To use both labeled and unlabeled data, start with at least five completed forms of the same type for the labeled training data and then add unlabeled data to the required data set.
-&emsp;&emsp;&emsp;&emsp;&emsp;&emsp;&#129155;
>[!div class="nextstepaction"]
->[Analyze documents with your custom model](quickstarts/client-library.md#analyze-forms-with-a-custom-model)
+>[&#120786;. Analyze documents with your custom model](quickstarts/client-library.md#analyze-forms-with-a-custom-model)
Test your newly trained model by using a form that wasn't part of the training dataset. You can continue to do further training to improve the performance of your custom model.
-&emsp;&emsp;&emsp;&emsp;&emsp;&emsp;&#129155;
> [!div class="nextstepaction"]
->[Manage your custom models](quickstarts/client-library.md#manage-custom-models)
+>[&#120787;. Manage your custom models](quickstarts/client-library.md#manage-custom-models)
At any time, you can view a list of all the custom models under your subscription, retrieve information about a specific custom model, or delete a custom model from your account.
communication-services Reference https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/reference.md
# Reference documentation overview + The following table details the available Communication Services packages along with corresponding reference documentation: <!--note that this table also exists here and should be synced: https://github.com/Azure/Communication/blob/master/README.md -->
communication-services Teams Interop https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/teams-interop.md
# Teams interoperability + > [!IMPORTANT] > To enable/disable [Teams tenant interoperability](../concepts/teams-interop.md), complete [this form](https://forms.office.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbR21ouQM6BHtHiripswZoZsdURDQ5SUNQTElKR0VZU0VUU1hMOTBBMVhESS4u).
communication-services Get Started Teams Interop https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/quickstarts/voice-video-calling/get-started-teams-interop.md
zone_pivot_groups: acs-plat-web-ios-android
# Quickstart: Join your calling app to a Teams meeting + > [!IMPORTANT] > To enable/disable [Teams tenant interoperability](../../concepts/teams-interop.md), complete [this form](https://forms.office.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbR21ouQM6BHtHiripswZoZsdURDQ5SUNQTElKR0VZU0VUU1hMOTBBMVhESS4u).
communication-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/samples/overview.md
# Samples + Azure Communication Services has many samples available, which you can use to test out ACS services and features before creating your own application or use case. ## Application samples
container-instances Container Instances Volume Azure Files https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-instances/container-instances-volume-azure-files.md
Title: Mount Azure Files volume to container group description: Learn how to mount an Azure Files volume to persist state with Azure Container Instances Previously updated : 07/02/2020 Last updated : 03/24/2021
By default, Azure Container Instances are stateless. If the container is restarted, crashes, or stops, all of its state is lost. To persist state beyond the lifetime of the container, you must mount a volume from an external store. As shown in this article, Azure Container Instances can mount an Azure file share created with [Azure Files](../storage/files/storage-files-introduction.md). Azure Files offers fully managed file shares hosted in Azure Storage that are accessible via the industry standard Server Message Block (SMB) protocol. Using an Azure file share with Azure Container Instances provides file-sharing features similar to using an Azure file share with Azure virtual machines.
+## Limitations
+
+* You can only mount Azure Files shares to Linux containers. Review more about the differences in feature support for Linux and Windows container groups in the [overview](container-instances-overview.md#linux-and-windows-containers).
+* Mounting an Azure file share volume requires the Linux container to run as *root*.
+* Azure file share volume mounts are limited to CIFS support.
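Within these limits, a container group that mounts an Azure file share can be described in a deployment YAML file. The following is a minimal sketch: the resource names, share name, image, and placeholder values are illustrative assumptions, not values from this article.

```yaml
# Illustrative container group sketch (names and placeholders are assumptions)
apiVersion: '2019-12-01'
location: eastus
name: file-share-demo
properties:
  containers:
  - name: app
    properties:
      image: mcr.microsoft.com/azuredocs/aci-hellofiles
      resources:
        requests:
          cpu: 1
          memoryInGB: 1.5
      volumeMounts:
      - name: filesharevolume
        mountPath: /aci/logs
  osType: Linux
  volumes:
  - name: filesharevolume
    azureFile:
      shareName: acishare
      storageAccountName: <storage-account-name>
      storageAccountKey: <storage-account-key>
type: Microsoft.ContainerInstance/containerGroups
```

Note that `osType` is `Linux`, matching the first limitation above.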
+ > [!NOTE]
-> Mounting an Azure Files share is currently restricted to Linux containers. Find current platform differences in the [overview](container-instances-overview.md#linux-and-windows-containers).
->
-> Mounting an Azure Files share to a container instance is similar to a Docker [bind mount](https://docs.docker.com/storage/bind-mounts/). Be aware that if you mount a share into a container directory in which files or directories exist, these files or directories are obscured by the mount and are not accessible while the container runs.
+> Mounting an Azure Files share to a container instance is similar to a Docker [bind mount](https://docs.docker.com/storage/bind-mounts/). If you mount a share into a container directory in which files or directories exist, the mount obscures files or directories, making them inaccessible while the container runs.
> > [!IMPORTANT]
container-registry Container Registry Event Grid Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-registry/container-registry-event-grid-quickstart.md
description: In this quickstart, you enable Event Grid events for your container
Last updated 08/23/2018
-# Customer intent: As a container registry owner, I want to send events to Event Grid
-# when container images are pushed to or deleted from my container registry so that
-# downstream applications can react to those events.
+# Customer intent: As a container registry owner, I want to send events to Event Grid when container images are pushed to or deleted from my container registry so that downstream applications can react to those events.
# Quickstart: Send events from private container registry to Event Grid
container-registry Container Registry Tutorial Base Image Update https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-registry/container-registry-tutorial-base-image-update.md
description: In this tutorial, you learn how to configure an Azure Container Reg
Last updated 11/24/2020
-# Customer intent: As a developer or devops engineer, I want container
-# images to be built automatically when the base image of a container is
-# updated in the registry.
+# Customer intent: As a developer or devops engineer, I want container images to be built automatically when the base image of a container is updated in the registry.
# Tutorial: Automate container image builds when a base image is updated in an Azure container registry
container-registry Container Registry Tutorial Build Task https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-registry/container-registry-tutorial-build-task.md
description: In this tutorial, you learn how to configure an Azure Container Reg
Last updated 11/24/2020
-# Customer intent: As a developer or devops engineer, I want to trigger
-# container image builds automatically when I commit code to a Git repo.
+# Customer intent: As a developer or devops engineer, I want to trigger container image builds automatically when I commit code to a Git repo.
# Tutorial: Automate container image builds in the cloud when you commit source code
container-registry Container Registry Tutorial Multistep Task https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-registry/container-registry-tutorial-multistep-task.md
description: In this tutorial, you learn how to configure an Azure Container Reg
Last updated 11/24/2020
-# Customer intent: As a developer or devops engineer, I want to trigger
-# a multi-step container workflow automatically when I commit code to a Git repo.
+# Customer intent: As a developer or devops engineer, I want to trigger a multi-step container workflow automatically when I commit code to a Git repo.
# Tutorial: Run a multi-step container workflow in the cloud when you commit source code
container-registry Container Registry Tutorial Quick Task https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-registry/container-registry-tutorial-quick-task.md
description: In this tutorial, you learn how to build a Docker container image i
Last updated 11/24/2020
-# Customer intent: As a developer or devops engineer, I want to quickly build
-# container images in Azure, without having to install dependencies like Docker
-# Engine, so that I can simplify my inner-loop development pipeline.
+# Customer intent: As a developer or devops engineer, I want to quickly build container images in Azure, without having to install dependencies like Docker Engine, so that I can simplify my inner-loop development pipeline.
# Tutorial: Build and deploy container images in the cloud with Azure Container Registry Tasks
cosmos-db Cassandra Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/cassandra-troubleshoot.md
Title: Troubleshoot common errors in Azure Cosmos DB Cassandra API
-description: This doc discusses the ways to troubleshoot common issues encountered in Azure Cosmos DB Cassandra API
+ Title: Troubleshoot common errors in the Azure Cosmos DB Cassandra API
+description: This article discusses common issues in the Azure Cosmos DB Cassandra API and how to troubleshoot them.
-# Troubleshoot common issues in Azure Cosmos DB Cassandra API
+# Troubleshoot common issues in the Azure Cosmos DB Cassandra API
+ [!INCLUDE[appliesto-cassandra-api](includes/appliesto-cassandra-api.md)]
-Cassandra API in Azure Cosmos DB is a compatibility layer, which provides [wire protocol support](cassandra-support.md) for the popular open-source Apache Cassandra database, and is powered by [Azure Cosmos DB](./introduction.md). As a fully managed cloud-native service, Azure Cosmos DB provides [guarantees on availability, throughput, and consistency](https://azure.microsoft.com/support/legal/sla/cosmos-db/v1_3/) for Cassandra API. These guarantees are not possible in legacy implementations of Apache Cassandra. Cassandra API also facilitates zero-maintenance platform operations, and zero-downtime patching. As such, many of it's backend operations are different from Apache Cassandra, so we recommend particular settings and approaches to avoid common errors.
+The Cassandra API in [Azure Cosmos DB](./introduction.md) is a compatibility layer that provides [wire protocol support](cassandra-support.md) for the open-source Apache Cassandra database.
+
+This article describes common errors and solutions for applications that use the Azure Cosmos DB Cassandra API. If your error isn't listed here, and it occurs when you execute a [supported operation in Cassandra](cassandra-support.md) but not when using native Apache Cassandra, [create an Azure support request](../azure-portal/supportability/how-to-create-azure-support-request.md).
-This article describes common errors and solutions for applications consuming Azure Cosmos DB Cassandra API. If your error is not listed below, and you are experiencing an error when executing a [supported operation in Cassandra API](cassandra-support.md), where the error is *not present when using native Apache Cassandra*, [create an Azure support request](../azure-portal/supportability/how-to-create-azure-support-request.md).
+>[!NOTE]
+>As a fully managed cloud-native service, Azure Cosmos DB provides [guarantees on availability, throughput, and consistency](https://azure.microsoft.com/support/legal/sla/cosmos-db/v1_3/) for the Cassandra API. The Cassandra API also facilitates zero-maintenance platform operations and zero-downtime patching.
+>
+>These guarantees aren't possible in previous implementations of Apache Cassandra, so many of the Cassandra API back-end operations differ from Apache Cassandra. We recommend particular settings and approaches to help avoid common errors.
## NoNodeAvailableException
-This is a top-level wrapper exception with a large number of possible causes and inner exceptions, many of which can be client-related.
-### Solution
-Some popular causes and solutions are as follows:
-- Idle timeout of Azure LoadBalancers: This may also manifest as `ClosedConnectionException`. To resolve this, set keep alive setting in driver (see [below](#enable-keep-alive-for-java-driver)) and increase keep-alive settings in operating system, or [adjust idle timeout in Azure Load Balancer](../load-balancer/load-balancer-tcp-idle-timeout.md?tabs=tcp-reset-idle-portal). -- **Client application resource exhaustion:** ensure that client machines have sufficient resources to complete the request.
-## Cannot connect to host
-You may see this error: `Cannot connect to any host, scheduling retry in 600000 milliseconds`.
+This error is a top-level wrapper exception with a large number of possible causes and inner exceptions, many of which can be client related.
+
+Common causes and solutions:
+
+- **Idle timeout of Azure LoadBalancers**: This issue might also manifest as `ClosedConnectionException`. To resolve the issue, set the keep-alive setting in the driver (see [Enable keep-alive for the Java driver](#enable-keep-alive-for-the-java-driver)) and increase keep-alive settings in your operating system, or [adjust idle timeout in Azure Load Balancer](../load-balancer/load-balancer-tcp-idle-timeout.md?tabs=tcp-reset-idle-portal).
+
+- **Client application resource exhaustion**: Ensure that client machines have sufficient resources to complete the request.
-### Solution
-This could be SNAT exhaustion on the client-side. Please follow the steps at [SNAT for outbound connections](../load-balancer/load-balancer-outbound-connections.md) to rule out this issue. This may also be an idle timeout issue where the Azure load balancer has 4 minutes of idle timeout by default. See documentation at [Load balancer idle timeout](../load-balancer/load-balancer-tcp-idle-timeout.md?tabs=tcp-reset-idle-portal). Enable tcp-keep alive from the driver settings (see [below](#enable-keep-alive-for-java-driver)) and set the `keepAlive` interval on the operating system to less than 4 minutes.
+## Can't connect to a host
-
+You might see this error: "Cannot connect to any host, scheduling retry in 600000 milliseconds."
+
+This error might be caused by source network address translation (SNAT) exhaustion on the client side. Follow the steps at [SNAT for outbound connections](../load-balancer/load-balancer-outbound-connections.md) to rule out this issue.
+
+The error might also be an idle timeout issue where the Azure load balancer has four minutes of idle timeout by default. See [Load balancer idle timeout](../load-balancer/load-balancer-tcp-idle-timeout.md?tabs=tcp-reset-idle-portal). [Enable keep-alive for the Java driver](#enable-keep-alive-for-the-java-driver) and set the `keepAlive` interval on the operating system to less than four minutes.
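The operating-system side of the keep-alive advice above can be sketched with the standard socket API: enable TCP keep-alive and set the idle probe interval below the load balancer's four-minute timeout. This is an OS-level illustration only, not the driver setting the article recommends; the 180-second value is an assumption, and `TCP_KEEPIDLE` is Linux-specific.

```python
import socket

# OS-level analogue of the keep-alive recommendation: enable TCP
# keep-alive and start probes before the load balancer's four-minute
# idle timeout. The 180-second value is an illustrative choice.
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
if hasattr(socket, "TCP_KEEPIDLE"):  # Linux-only option
    s.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, 180)

keepalive_on = s.getsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE)
s.close()
```

The same idea applies at the driver level, where the keep-alive interval is configured in the driver's connection options.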
## OverloadedException (Java)
-The total number of request units consumed is more than the request-units provisioned on the keyspace or table. So the requests are throttled.
-### Solution
-Consider scaling the throughput assigned to a keyspace or table from the Azure portal (see [here](manage-scale-cassandra.md) for scaling operations in Cassandra API) or you can implement a retry policy. For Java, see retry samples for [v3.x driver](https://github.com/Azure-Samples/azure-cosmos-cassandra-java-retry-sample) and [v4.x driver](https://github.com/Azure-Samples/azure-cosmos-cassandra-java-retry-sample-v4). See also [Azure Cosmos Cassandra Extensions for Java](https://github.com/Azure/azure-cosmos-cassandra-extensions).
-### OverloadedException even with sufficient throughput
-The system appears to be throttling requests despite sufficient throughput being provisioned for request volume and/or consumed request unit cost. There are two possible causes of unexpected rate limiting:
-- **Schema level operations:** Cassandra API implements a system throughput budget for schema-level operations (CREATE TABLE, ALTER TABLE, DROP TABLE). This budget should be enough for schema operations in a production system. However, if you have a high number of schema-level operations, it is possible you are exceeding this limit. As this budget is not user-controlled, you will need to consider lowering the number of schema operations being run. If taking this action does not resolve the issue, or it is not feasible for your workload, [create an Azure support request](../azure-portal/supportability/how-to-create-azure-support-request.md).
-- **Data skew:** when throughput is provisioned in Cassandra API, it is divided equally among physical partitions, and each physical partition has an upper limit. If you have a high amount of data being inserted or queried from one particular partition, it is possible to be rate-limited despite provisioning a large amount of overall throughput (request units) for that table. Review your data model and ensure you do not have excessive skew that could be causing hot partitions.
+Requests are throttled because the total number of request units consumed is higher than the number of request units that you provisioned on the keyspace or table.
+
+Consider scaling the throughput assigned to a keyspace or table from the Azure portal (see [Elastically scale an Azure Cosmos DB Cassandra API account](manage-scale-cassandra.md)) or implementing a retry policy.
+
+For Java, see retry samples for the [v3.x driver](https://github.com/Azure-Samples/azure-cosmos-cassandra-java-retry-sample) and the [v4.x driver](https://github.com/Azure-Samples/azure-cosmos-cassandra-java-retry-sample-v4). See also [Azure Cosmos Cassandra Extensions for Java](https://github.com/Azure/azure-cosmos-cassandra-extensions).
+
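+The retry samples above wrap driver calls with a backoff delay between attempts. A minimal, driver-agnostic sketch of that pattern follows; the class and parameter names are illustrative (not part of any SDK), and the `RuntimeException` stands in for the driver's `OverloadedException`:
+
+```java
+import java.util.concurrent.Callable;
+
+public class RetryWithBackoff {
+    // Exponential backoff: baseMs * 2^attempt, capped at maxMs.
+    static long delayMs(int attempt, long baseMs, long maxMs) {
+        return Math.min(baseMs << Math.min(attempt, 20), maxMs);
+    }
+
+    // Retry op up to maxAttempts times, sleeping between attempts.
+    static <T> T retry(Callable<T> op, int maxAttempts) throws Exception {
+        for (int attempt = 0; ; attempt++) {
+            try {
+                return op.call();
+            } catch (Exception e) { // a real policy would catch only throttling errors
+                if (attempt + 1 >= maxAttempts) throw e;
+                Thread.sleep(delayMs(attempt, 500, 10_000));
+            }
+        }
+    }
+
+    public static void main(String[] args) throws Exception {
+        int[] calls = {0};
+        // Simulate a request that's throttled twice, then succeeds.
+        String result = retry(() -> {
+            if (++calls[0] < 3) throw new RuntimeException("simulated OverloadedException");
+            return "ok";
+        }, 5);
+        System.out.println(result + " after " + calls[0] + " attempts");
+    }
+}
+```
+
+The linked Azure samples and extensions implement this logic against the actual driver exception types; prefer those in production.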
+### OverloadedException despite sufficient throughput
+
+The system seems to be throttling requests even though enough throughput is provisioned for request volume or consumed request unit cost. There are two possible causes:
+
+- **Schema level operations**: The Cassandra API implements a system throughput budget for schema-level operations (CREATE TABLE, ALTER TABLE, DROP TABLE). This budget should be enough for schema operations in a production system. However, if you have a high number of schema-level operations, you might exceed this limit.
+
+ Because the budget isn't user-controlled, consider lowering the number of schema operations that you run. If that action doesn't resolve the issue or it isn't feasible for your workload, [create an Azure support request](../azure-portal/supportability/how-to-create-azure-support-request.md).
+
+- **Data skew**: When throughput is provisioned in the Cassandra API, it's divided equally between physical partitions, and each physical partition has an upper limit. If you have a high amount of data being inserted or queried from one particular partition, it might be rate-limited even if you provision a large amount of overall throughput (request units) for that table.
+
+ Review your data model and ensure you don't have excessive skew that might cause hot partitions.
+
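+To see why skew matters: provisioned request units are split evenly across physical partitions, so one hot partition key can saturate its partition's share while the others sit idle. A small sketch follows; the hashing and partition count are illustrative, not the service's actual placement logic:
+
+```java
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.List;
+
+public class PartitionSkewCheck {
+    // Illustrative bucketing of a partition-key value onto one of N physical partitions.
+    static int partitionFor(String partitionKey, int partitionCount) {
+        return Math.floorMod(partitionKey.hashCode(), partitionCount);
+    }
+
+    // Count how many rows land on each physical partition.
+    static int[] distribution(List<String> partitionKeys, int partitionCount) {
+        int[] counts = new int[partitionCount];
+        for (String key : partitionKeys) {
+            counts[partitionFor(key, partitionCount)]++;
+        }
+        return counts;
+    }
+
+    public static void main(String[] args) {
+        // Skewed workload: 90 of 100 rows share one partition key.
+        List<String> keys = new ArrayList<>();
+        for (int i = 0; i < 90; i++) keys.add("deviceA");
+        for (int i = 0; i < 10; i++) keys.add("device" + i);
+
+        int[] counts = distribution(keys, 4);
+        int hottest = Arrays.stream(counts).max().getAsInt();
+        // Each of the 4 partitions gets 1/4 of the provisioned RU/s, but one
+        // partition serves at least 90% of the traffic, so it's throttled first.
+        System.out.println("per-partition row counts: " + Arrays.toString(counts));
+        System.out.println("hottest partition share: " + hottest + "%");
+    }
+}
+```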
+## Intermittent connectivity errors (Java)
-## Intermittent connectivity errors (Java)
Connection drops or times out unexpectedly.
-### Solution
-The Apache Cassandra drivers for Java provide two native reconnection policies: `ExponentialReconnectionPolicy` and `ConstantReconnectionPolicy`. The default is `ExponentialReconnectionPolicy`. However, for Azure Cosmos DB Cassandra API, we recommend `ConstantReconnectionPolicy` with a delay of 2 seconds. See the [driver documentation](https://docs.datastax.com/en/developer/java-driver/4.9/manual/core/reconnection/) for Java v4.x driver, and [here](https://docs.datastax.com/en/developer/java-driver/3.7/manual/reconnection/) for Java 3.x guidance see also [Configuring ReconnectionPolicy for Java Driver](#configuring-reconnectionpolicy-for-java-driver) examples below.
+The Apache Cassandra drivers for Java provide two native reconnection policies: `ExponentialReconnectionPolicy` and `ConstantReconnectionPolicy`. The default is `ExponentialReconnectionPolicy`. However, for Azure Cosmos DB Cassandra API, we recommend `ConstantReconnectionPolicy` with a two-second delay.
+
+See the [documentation for the Java 4.x driver](https://docs.datastax.com/en/developer/java-driver/4.9/manual/core/reconnection/), the [documentation for the Java 3.x driver](https://docs.datastax.com/en/developer/java-driver/3.7/manual/reconnection/), or the [Configure ReconnectionPolicy for the Java driver](#configure-reconnectionpolicy-for-the-java-driver) examples later in this article.
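+For intuition, compare the reconnect schedules the two policies produce. This sketch only computes the delays; the base and max values are illustrative rather than the driver's exact defaults:
+
+```java
+public class ReconnectionDelays {
+    // ConstantReconnectionPolicy: the same delay before every attempt.
+    static long constantDelayMs() {
+        return 2_000;
+    }
+
+    // ExponentialReconnectionPolicy: delay doubles each attempt up to a cap.
+    static long exponentialDelayMs(int attempt, long baseMs, long maxMs) {
+        return Math.min(baseMs << Math.min(attempt, 20), maxMs);
+    }
+
+    public static void main(String[] args) {
+        for (int attempt = 0; attempt < 5; attempt++) {
+            System.out.printf("attempt %d: constant=%d ms, exponential=%d ms%n",
+                    attempt, constantDelayMs(),
+                    exponentialDelayMs(attempt, 1_000, 600_000));
+        }
+        // The exponential schedule can back off for minutes, which delays recovery
+        // after a brief drop; a constant two-second delay reconnects promptly.
+    }
+}
+```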
## Error with load-balancing policy
-If you have implemented a load-balancing policy in v3.x of the Java Datastax driver, with code similar to the below:
+You might have implemented a load-balancing policy in v3.x of the Java DataStax driver, with code similar to:
```java
cluster = Cluster.builder()
    // local datacenter set on the load-balancing policy
    .withLoadBalancingPolicy(DCAwareRoundRobinPolicy.builder().withLocalDc("West US").build())
    .build();
```
-If the value for `withLocalDc()` does not match the contact point datacenter, you may experience a very intermittent error: `com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (no host was tried)`.
+If the value for `withLocalDc()` doesn't match the contact point datacenter, you might experience an intermittent error: `com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (no host was tried)`.
-### Solution
-Implement [CosmosLoadBalancingPolicy](https://github.com/Azure/azure-cosmos-cassandra-extensions/blob/master/package/src/main/java/com/microsoft/azure/cosmos/cassandra/CosmosLoadBalancingPolicy.java) (you may need to upgrade datastax minor version to make it work):
+Implement the [CosmosLoadBalancingPolicy](https://github.com/Azure/azure-cosmos-cassandra-extensions/blob/master/package/src/main/java/com/microsoft/azure/cosmos/cassandra/CosmosLoadBalancingPolicy.java). To make it work, you might need to upgrade to a newer DataStax driver minor version:
```java
LoadBalancingPolicy loadBalancingPolicy = new CosmosLoadBalancingPolicy.Builder().withWriteDC("West US").withReadDC("West US").build();
```
-## Count fails on large table
-When running `select count(*) from table` or similar for a large number of rows, the server times out.
+## The count fails on a large table
-### Solution
-If using a local CQLSH client you can try to change the `--connect-timeout` or `--request-timeout` settings (see more details [here](https://cassandra.apache.org/doc/latest/tools/cqlsh.html)). If this is not sufficient and count still times out, you can get a count of records from the Azure Cosmos DB backend telemetry by going to metrics tab in Azure portal, selecting the metric `document count`, then adding a filter for the database or collection (the analog of table in Azure Cosmos DB). You can then hover over the resulting graph for the point in time at which you want a count of the number of records.
+When you run `select count(*) from table` or similar for a large number of rows, the server times out.
+If you're using a local CQLSH client, change the `--connect-timeout` or `--request-timeout` settings. See [cqlsh: the CQL shell](https://cassandra.apache.org/doc/latest/tools/cqlsh.html).
+If the count still times out, you can get a count of records from the Azure Cosmos DB back-end telemetry by going to the metrics tab in the Azure portal, selecting the metric `document count`, and then adding a filter for the database or collection (the analog of the table in Azure Cosmos DB). You can then hover over the resulting graph for the point in time at which you want a count of the number of records.
+
-## Configuring ReconnectionPolicy for Java Driver
+## Configure ReconnectionPolicy for the Java driver
### Version 3.x
-For version 3.x of the Java driver, configure the reconnection policy when creating a cluster object:
+For version 3.x of the Java driver, configure the reconnection policy when you create a cluster object:
```java
import com.datastax.driver.core.policies.ConstantReconnectionPolicy;

// Reconnect with a constant two-second delay, as recommended for Cassandra API.
Cluster.builder()
    .withReconnectionPolicy(new ConstantReconnectionPolicy(2000))
    .build();
```
### Version 4.x
-For version 4.x of the Java driver, configure the reconnection policy by overriding settings in `reference.conf` file:
+For version 4.x of the Java driver, configure the reconnection policy by overriding settings in the `reference.conf` file:
```xml
datastax-java-driver {
  advanced.reconnection-policy {
    class = ConstantReconnectionPolicy
    base-delay = 2 seconds
  }
}
```
-## Enable keep-alive for Java Driver
+## Enable keep-alive for the Java driver
### Version 3.x
-For version 3.x of the Java driver, set keep-alive when creating a Cluster object, and ensure keep-alive is [enabled in the operating system](https://knowledgebase.progress.com/articles/Article/configure-OS-TCP-KEEPALIVE-000080089):
+For version 3.x of the Java driver, set keep-alive when you create a cluster object, and then ensure that keep-alive is [enabled in the operating system](https://knowledgebase.progress.com/articles/Article/configure-OS-TCP-KEEPALIVE-000080089):
```java
import java.net.SocketOptions;

SocketOptions options = new SocketOptions();
options.setKeepAlive(true);

cluster = Cluster.builder().addContactPoints(contactPoints).withPort(port)
    .withSocketOptions(options)
    .build();
```
### Version 4.x
-For version 4.x of the Java driver, set keep-alive by overriding settings in `reference.conf` and ensure keep-alive is [enabled in the operating system](https://knowledgebase.progress.com/articles/Article/configure-OS-TCP-KEEPALIVE-000080089):
+For version 4.x of the Java driver, set keep-alive by overriding settings in `reference.conf`, and then ensure that keep-alive is [enabled in the operating system](https://knowledgebase.progress.com/articles/Article/configure-OS-TCP-KEEPALIVE-000080089):
```xml
datastax-java-driver {
  advanced.socket {
    keep-alive = true
  }
}
```
## Next steps
-- Learn about the [supported features](cassandra-support.md) in Azure Cosmos DB Cassandra API.
-- Learn how to [migrate from native Apache Cassandra to Azure Cosmos DB Cassandra API](cassandra-migrate-cosmos-db-databricks.md)
+- Learn about [supported features](cassandra-support.md) in the Azure Cosmos DB Cassandra API.
+- Learn how to [migrate from native Apache Cassandra to Azure Cosmos DB Cassandra API](cassandra-migrate-cosmos-db-databricks.md).
cosmos-db Create Graph Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/create-graph-python.md
ms.devlang: python Previously updated : 01/22/2019 Last updated : 03/29/2021
Now let's switch to working with code. Let's clone a Gremlin API app from GitHub
cd "C:\git-samples" ```
-3. Run the following command to clone the sample repository. This command creates a copy of the sample app on your computer.
+3. Run the following command to clone the sample repository. This command creates a copy of the sample app on your computer.
```bash git clone https://github.com/Azure-Samples/azure-cosmos-db-graph-python-getting-started.git
Now let's switch to working with code. Let's clone a Gremlin API app from GitHub
## Review the code
-This step is optional. If you're interested in learning how the database resources are created in the code, you can review the following snippets. The snippets are all taken from the *connect.py* file in the *C:\git-samples\azure-cosmos-db-graph-python-getting-started\\* folder. Otherwise, you can skip ahead to [Update your connection string](#update-your-connection-information).
+This step is optional. If you're interested in learning how the database resources are created in the code, you can review the following snippets. The snippets are all taken from the *connect.py* file in the *C:\git-samples\azure-cosmos-db-graph-python-getting-started\\* folder. Otherwise, you can skip ahead to [Update your connection string](#update-your-connection-information).
-* The Gremlin `client` is initialized in line 104 in *connect.py*:
+* The Gremlin `client` is initialized in line 104 in *connect.py*. Make sure to replace `<YOUR_DATABASE>` and `<YOUR_CONTAINER_OR_GRAPH>` with the values of your account's database name and graph name:
```python
...
client = client.Client('wss://<YOUR_ENDPOINT>.gremlin.cosmosdb.azure.com:443/','g',
-    username="/dbs/<YOUR_DATABASE>/colls/<YOUR_COLLECTION_OR_GRAPH>",
+    username="/dbs/<YOUR_DATABASE>/colls/<YOUR_CONTAINER_OR_GRAPH>",
    password="<YOUR_PASSWORD>")
...
```
cosmos-db How To Setup Rbac https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/how-to-setup-rbac.md
description: Learn how to configure role-based access control with Azure Active
Previously updated : 03/24/2021 Last updated : 03/30/2021
This additional information flows in the **DataPlaneRequests** log category and
## Limits

- You can create up to 100 role definitions and 2,000 role assignments per Azure Cosmos DB account.
+- You can only assign role definitions to Azure AD identities belonging to the same Azure AD tenant as your Azure Cosmos DB account.
- Azure AD group resolution is not currently supported for identities that belong to more than 200 groups.
- The Azure AD token is currently passed as a header with each individual request sent to the Azure Cosmos DB service, increasing the overall payload size.
- Accessing your data with Azure AD through the [Azure Cosmos DB Explorer](data-explorer.md) isn't supported yet. Using the Azure Cosmos DB Explorer still requires the user to have access to the account's primary key for now.
cosmos-db Performance Tips Dotnet Sdk V3 Sql https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/performance-tips-dotnet-sdk-v3-sql.md
If you're testing at high throughput levels, or at rates that are greater than 5
**Connection policy: Use direct connection mode**
-.NET V3 SDK default connection mode is direct. You configure the connection mode when you create the `CosmosClient` instance in `CosmosClientOptions`. To learn more about different connectivity options, see the [connectivity modes](sql-sdk-connection-modes.md) article.
+.NET V3 SDK default connection mode is direct with TCP protocol. You configure the connection mode when you create the `CosmosClient` instance in `CosmosClientOptions`. To learn more about different connectivity options, see the [connectivity modes](sql-sdk-connection-modes.md) article.
```csharp
string connectionString = "<your-account-connection-string>";
```
cost-management-billing Cost Mgt Best Practices https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cost-management-billing/costs/cost-mgt-best-practices.md
To learn more about the various options, visit [How to buy Azure](https://azure.
#### [Free](https://azure.microsoft.com/free/)
- 12 months of popular free services
-- $200 in credit to explore services for 30 days
+- 200 USD credit in your billing currency to explore services for 30 days
- 25+ services are always free

#### [Pay as you go](https://azure.microsoft.com/offers/ms-azr-0003p)
cost-management-billing Avoid Charges Free Account https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cost-management-billing/manage/avoid-charges-free-account.md
tags: billing
Previously updated : 12/04/2020 Last updated : 03/30/2021 # Avoid charges with your Azure free account
-Eligible new users get $200 of Azure credit for the first 30 days and a limited quantity of free services for 12 months with your [Azure free account](https://azure.microsoft.com/free/). To learn about limits of free services, see the [Azure free account FAQ](https://azure.microsoft.com/free/free-account-faq/). As long as you have unexpired credit or you use only free services within the limits, you're not charged.
+Eligible new users get 200 USD Azure credit in your billing currency for the first 30 days and a limited quantity of free services for 12 months with your [Azure free account](https://azure.microsoft.com/free/). To learn about limits of free services, see the [Azure free account FAQ](https://azure.microsoft.com/free/free-account-faq/). As long as you have unexpired credit or you use only free services within the limits, you're not charged.
Let's look at some of the reasons you can incur charges on your Azure free account.
cost-management-billing Create Free Services https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cost-management-billing/manage/create-free-services.md
Previously updated : 12/04/2020 Last updated : 03/30/2021 # Create services included with Azure free account
-During the first 30 days after you've created an Azure free account, you have a $200 credit to use on any service, except for third-party Marketplace purchases. You can experiment with different tiers and types of Azure services using the free credit to try out Azure. If you use services or Azure resources that arenΓÇÖt free during that time, charges are deducted against your credit.
+During the first 30 days after you've created an Azure free account, you have 200 USD credit in your billing currency to use on any service, except for third-party Marketplace purchases. You can experiment with different tiers and types of Azure services using the free credit to try out Azure. If you use services or Azure resources that aren't free during that time, charges are deducted against your credit.
If you don't use all of your credit by the end of the first 30 days, it's lost. After the first 30 days and up to 12 months after sign-up, you can only use a limited quantity of *some services*; not all Azure services are free. If you upgrade before 30 days and have remaining credit, you can use the rest of your credit with a pay-as-you-go subscription for the remaining days. For example, if you sign up for the free account on November 1 and upgrade on November 5, you have until November 30 to use your credit in the new pay-as-you-go subscription.
cost-management-billing Programmatically Create Subscription Enterprise Agreement https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cost-management-billing/manage/programmatically-create-subscription-enterprise-agreement.md
When you create an Azure subscription programmatically, that subscription is gov
You must have an Owner role on an Enrollment Account to create a subscription. There are two ways to get the role:

* The Enterprise Administrator of your enrollment can [make you an Account Owner](https://ea.azure.com/helpdocs/addNewAccount) (sign in required), which makes you an Owner of the Enrollment Account.
-* An existing Owner of the Enrollment Account can [grant you access](/rest/api/billing/2019-10-01-preview/enrollmentaccountroleassignments/put). Similarly, to use a service principal to create an EA subscription, you must [grant that service principal the ability to create subscriptions](/rest/api/billing/2019-10-01-preview/enrollmentaccountroleassignments/put).
+* An existing Owner of the Enrollment Account can [grant you access](/rest/api/billing/2019-10-01-preview/enrollmentaccountroleassignments/put). Similarly, to use a service principal to create an EA subscription, you must [grant that service principal the ability to create subscriptions](/rest/api/billing/2019-10-01-preview/enrollmentaccountroleassignments/put).
+ If you're using an SPN to create subscriptions, use the ObjectId of the Azure AD application registration as the service principal ObjectId. You can retrieve it by using [Azure Active Directory PowerShell](/powershell/module/azuread/get-azureadserviceprincipal?view=azureadps-2.0) or the [Azure CLI](/cli/azure/ad/sp?view=azure-cli-latest#az_ad_sp_list).
> [!NOTE]
> Ensure that you use the correct API version to give the enrollment account owner permissions. For this article and for the APIs documented in it, use the [2019-10-01-preview](/rest/api/billing/2019-10-01-preview/enrollmentaccountroleassignments/put) API. If you're migrating to use the newer APIs, you must grant owner permission again using [2019-10-01-preview](/rest/api/billing/2019-10-01-preview/enrollmentaccountroleassignments/put). Your previous configuration made with the [2015-07-01 version](grant-access-to-create-subscription.md) doesn't automatically convert for use with the newer APIs.
cost-management-billing Programmatically Create Subscription Microsoft Customer Agreement https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cost-management-billing/manage/programmatically-create-subscription-microsoft-customer-agreement.md
Previously updated : 03/12/2021 Last updated : 03/29/2021
When you create an Azure subscription programmatically, that subscription is gov
## Prerequisites
-You must have an owner, contributor, or Azure subscription creator role on an invoice section or owner or contributor role on a billing profile or a billing account to create subscriptions. For more information, see [Subscription billing roles and tasks](understand-mca-roles.md#subscription-billing-roles-and-tasks).
+You must have an owner, contributor, or Azure subscription creator role on an invoice section or owner or contributor role on a billing profile or a billing account to create subscriptions. You can also give the same role to a service principal name (SPN). For more information about roles and assigning permission to them, see [Subscription billing roles and tasks](understand-mca-roles.md#subscription-billing-roles-and-tasks).
+
+If you're using an SPN to create subscriptions, use the ObjectId of the Azure AD application registration as the service principal ObjectId. You can retrieve it by using [Azure Active Directory PowerShell](/powershell/module/azuread/get-azureadserviceprincipal?view=azureadps-2.0) or the [Azure CLI](/cli/azure/ad/sp?view=azure-cli-latest#az_ad_sp_list).
If you don't know whether you have access to a Microsoft Customer Agreement account, see [Check access to a Microsoft Customer Agreement](../understand/mca-overview.md#check-access-to-a-microsoft-customer-agreement).
cost-management-billing Review Enterprise Billing https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cost-management-billing/manage/review-enterprise-billing.md
Last updated 08/20/2020
-# As an administrator or developer, I want to use REST APIs to review billing data for all subscriptions and departments in the enterprise enrollment.
+# Customer intent: As an administrator or developer, I want to use REST APIs to review billing data for all subscriptions and departments in the enterprise enrollment.
cost-management-billing Review Service Usage Api https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cost-management-billing/manage/review-service-usage-api.md
Last updated 08/20/2020
-# As an administrator or developer, I want to use REST APIs to review resource and service usage data under my control.
+# Customer intent: As an administrator or developer, I want to use REST APIs to review resource and service usage data under my control.
# Review Azure resource usage using the REST API
cost-management-billing Review Subscription Billing https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cost-management-billing/manage/review-subscription-billing.md
Last updated 08/20/2020
-# As an administrator or developer, I want to use REST APIs to review subscription billing data for a specified period.
+# Customer intent: As an administrator or developer, I want to use REST APIs to review subscription billing data for a specified period.
# Review subscription billing using REST APIs
cost-management-billing Subscription Disabled https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cost-management-billing/manage/subscription-disabled.md
tags: billing
Previously updated : 01/19/2021 Last updated : 03/30/2021
Your Azure subscription can get disabled because your credit has expired, you re
## Your credit is expired
-When you sign up for an Azure free account, you get a Free Trial subscription, which provides you $200 in Azure credits for 30 days and 12 months of free services. At the end of 30 days, Azure disables your subscription. Your subscription is disabled to protect you from accidentally incurring charges for usage beyond the credit and free services included with your subscription. To continue using Azure services, you must [upgrade your subscription](upgrade-azure-subscription.md). After you upgrade, your subscription still has access to free services for 12 months. You only get charged for usage beyond the free service quantity limits.
+When you sign up for an Azure free account, you get a Free Trial subscription, which provides you 200 USD Azure credit in your billing currency for 30 days and 12 months of free services. At the end of 30 days, Azure disables your subscription. Your subscription is disabled to protect you from accidentally incurring charges for usage beyond the credit and free services included with your subscription. To continue using Azure services, you must [upgrade your subscription](upgrade-azure-subscription.md). After you upgrade, your subscription still has access to free services for 12 months. You only get charged for usage beyond the free service quantity limits.
## You reached your spending limit
cost-management-billing Upgrade Azure Subscription https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cost-management-billing/manage/upgrade-azure-subscription.md
You can upgrade your [Azure free account](https://azure.microsoft.com/free/) to [pay-as-you-go rates](https://azure.microsoft.com/offers/ms-azr-0003p/) in the Azure portal.
-If you have an [Azure for Students Starter account](https://azure.microsoft.com/offers/ms-azr-0144p/) and are eligible for an [Azure free account](https://azure.microsoft.com/free/), you can upgrade to it to a [Azure free account](https://azure.microsoft.com/free/). You'll get $200 of Azure credits and 12 months of free services on upgrade. If you don't qualify for a free account, you can upgrade to [pay-as-you-go rates](https://azure.microsoft.com/offers/ms-azr-0003p/) with a [support request](https://go.microsoft.com/fwlink/?linkid=2083458).
+If you have an [Azure for Students Starter account](https://azure.microsoft.com/offers/ms-azr-0144p/) and are eligible for an [Azure free account](https://azure.microsoft.com/free/), you can upgrade it to an [Azure free account](https://azure.microsoft.com/free/). You'll get 200 USD Azure credit in your billing currency and 12 months of free services on upgrade. If you don't qualify for a free account, you can upgrade to [pay-as-you-go rates](https://azure.microsoft.com/offers/ms-azr-0003p/) with a [support request](https://go.microsoft.com/fwlink/?linkid=2083458).
If you have an [Azure for Students](https://azure.microsoft.com/offers/ms-azr-0170p/) account, you can upgrade to [pay-as-you-go rates](https://azure.microsoft.com/offers/ms-azr-0003p/) with a [support request](https://go.microsoft.com/fwlink/?linkid=2083458).
data-factory Connector Odata https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-odata.md
description: Learn how to copy data from OData sources to supported sink data st
Previously updated : 10/14/2020 Last updated : 03/30/2021 # Copy data from an OData source by using Azure Data Factory
When you copy data from OData, the following mappings are used between OData dat
> [!NOTE]
> OData complex data types (such as **Object**) aren't supported.
+## Copy data from Project Online
+
+To copy data from Project Online, you can use the OData connector and an access token obtained from tools like Postman.
+
+> [!CAUTION]
+> The access token expires in 1 hour by default. You need to get a new access token when it expires.
+
+1. Use **Postman** to get the access token:
+
+ 1. Navigate to the **Authorization** tab in Postman.
+ 1. In the **Type** box, select **OAuth 2.0**, and in the **Add authorization data to** box, select **Request Headers**.
+ 1. Fill in the following information on the **Configure New Token** page to get a new access token:
+ - **Grant type**: Select **Authorization Code**.
+ - **Callback URL**: Enter `https://www.localhost.com/`. 
+ - **Auth URL**: Enter `https://login.microsoftonline.com/common/oauth2/authorize?resource=https://<your tenant name>.sharepoint.com`. Replace `<your tenant name>` with your own tenant name.
+ - **Access Token URL**: Enter `https://login.microsoftonline.com/common/oauth2/token`.
+ - **Client ID**: Enter your AAD service principal ID.
+ - **Client Secret**: Enter your service principal secret.
+ - **Client Authentication**: Select **Send as Basic Auth header**.
+
+ 1. You'll be asked to sign in with your username and password.
+ 1. After you get your access token, copy and save it for the next step.
+
+ [![Use Postman to get the access token](./media/connector-odata/odata-project-online-postman-access-token-inline.png)](./media/connector-odata/odata-project-online-postman-access-token-expanded.png#lightbox)
+
+1. Create the OData linked service:
+ - **Service URL**: Enter `https://<your tenant name>.sharepoint.com/sites/pwa/_api/Projectdata`. Replace `<your tenant name>` with your own tenant name.
+ - **Authentication type**: Select **Anonymous**.
+ - **Auth headers**:
+ - **Property name**: Choose **Authorization**.
+ - **Value**: Enter the **access token** copied from step 1.
+ - Test the linked service.
+
+ ![Create OData linked service](./media/connector-odata/odata-project-online-linked-service.png)
+
+1. Create the OData dataset:
+ 1. Create the dataset with the OData linked service created in step 2.
+ 1. Preview data.
+
+ ![Preview data](./media/connector-odata/odata-project-online-preview-data.png)
+
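+Once you have the token, the call the linked service makes amounts to an HTTP GET with a bearer `Authorization` header against the Project OData endpoint. A sketch using Java's built-in `HttpClient` types; the tenant name, the `Projects` entity set, and the token are placeholders:
+
+```java
+import java.net.URI;
+import java.net.http.HttpRequest;
+
+public class ProjectOnlineRequest {
+    // Build the authenticated OData request; tenant and accessToken are placeholders.
+    static HttpRequest buildRequest(String tenant, String accessToken) {
+        return HttpRequest.newBuilder()
+                .uri(URI.create("https://" + tenant + ".sharepoint.com/sites/pwa/_api/Projectdata/Projects"))
+                .header("Authorization", "Bearer " + accessToken)
+                .header("Accept", "application/json")
+                .GET()
+                .build();
+    }
+
+    public static void main(String[] args) {
+        HttpRequest request = buildRequest("contoso", "<ACCESS_TOKEN>");
+        System.out.println(request.method() + " " + request.uri());
+        // Send with HttpClient.newHttpClient().send(request, BodyHandlers.ofString())
+        // in a real call; remember the token expires after about an hour.
+    }
+}
+```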
## Lookup activity properties
data-factory Data Flow Transformation Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-transformation-overview.md
Below is a list of the transformations currently supported in mapping data flow.
| [Union](data-flow-union.md) | Multiple inputs/outputs | Combine multiple data streams vertically | | [Unpivot](data-flow-unpivot.md) | Schema modifier | Pivot columns into row values | | [Window](data-flow-window.md) | Schema modifier | Define window-based aggregations of columns in your data streams. |
+| [Parse](data-flow-parse.md) | Schema modifier | Parse column data to JSON or delimited text |
data-factory How To Create Event Trigger https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/how-to-create-event-trigger.md
This section shows you how to create a storage event trigger within the Azure Da
In the preceding example, the trigger is configured to fire when a blob path ending in .csv is created in the folder _event-testing_ in the container _sample-data_. The **folderPath** and **fileName** properties capture the location of the new blob. For example, when MoviesDB.csv is added to the path sample-data/event-testing, `@triggerBody().folderPath` has a value of `sample-data/event-testing` and `@triggerBody().fileName` has a value of `moviesDB.csv`. These values are mapped, in the example, to the pipeline parameters `sourceFolder` and `sourceFile`, which can be used throughout the pipeline as `@pipeline().parameters.sourceFolder` and `@pipeline().parameters.sourceFile` respectively. > [!NOTE]
- > If you are creating your pipeline and trigger in [Azure Synapse Analytics](https://docs.microsoft.com/azure/synapse-analytics/), you must use `@trigger().outputs.body.fileName` and `@trigger().outputs.body.folderPath` as parameters. Those two properties capture blob information. Use those properties instead of using `@triggerBody().fileName` and `@triggerBody().folderPath`.
-
- > [!NOTE]
- > If you are creating your pipeline and trigger in Azure Synapse Analytics you must use `@trigger().outputs.body.fileName` and `@trigger().outputs.body.folderPath` as parameters to capture blob information instead of `@triggerBody().fileName` and `@triggerBody().folderPath`.
+ > If you are creating your pipeline and trigger in [Azure Synapse Analytics](/synapse-analytics), you must use `@trigger().outputs.body.fileName` and `@trigger().outputs.body.folderPath` as parameters. Those two properties capture blob information. Use those properties instead of using `@triggerBody().fileName` and `@triggerBody().folderPath`.
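As a sketch of the mapping described above, the relevant portion of a storage event trigger definition passes the captured blob properties to the pipeline parameters roughly like this. The trigger and pipeline names here are hypothetical; only the `parameters` mapping follows the example in the text:

```json
{
  "name": "StorageEventTrigger1",
  "properties": {
    "type": "BlobEventsTrigger",
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "MoviesPipeline",
          "type": "PipelineReference"
        },
        "parameters": {
          "sourceFolder": "@triggerBody().folderPath",
          "sourceFile": "@triggerBody().fileName"
        }
      }
    ]
  }
}
```

Inside the pipeline, the values then surface as `@pipeline().parameters.sourceFolder` and `@pipeline().parameters.sourceFile`.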
1. Click **Finish** once you are done.
databox-online Azure Stack Edge Deploy Add Shares https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-deploy-add-shares.md
Last updated 01/04/2021
-Customer intent: As an IT admin, I need to understand how to add and connect to shares on Azure Stack Edge Pro so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to add and connect to shares on Azure Stack Edge Pro so I can use it to transfer data to Azure.
# Tutorial: Transfer data with Azure Stack Edge Pro
databox-online Azure Stack Edge Deploy Configure Compute Advanced https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-deploy-configure-compute-advanced.md
Last updated 01/06/2021
-Customer intent: As an IT admin, I need to understand how to configure compute on Azure Stack Edge Pro for advanced deployment flow so I can use it to transform the data before sending it to Azure.
+# Customer intent: As an IT admin, I need to understand how to configure compute on Azure Stack Edge Pro for advanced deployment flow so I can use it to transform the data before sending it to Azure.
# Tutorial: Transform data with Azure Stack Edge Pro for advanced deployment flow
databox-online Azure Stack Edge Deploy Configure Compute https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-deploy-configure-compute.md
Last updated 01/06/2021
-Customer intent: As an IT admin, I need to understand how to configure compute on Azure Stack Edge Pro so I can use it to transform the data before sending it to Azure.
+# Customer intent: As an IT admin, I need to understand how to configure compute on Azure Stack Edge Pro so I can use it to transform the data before sending it to Azure.
# Tutorial: Transform the data with Azure Stack Edge Pro
databox-online Azure Stack Edge Deploy Connect Setup Activate https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-deploy-connect-setup-activate.md
Last updated 03/28/2019
-Customer intent: As an IT admin, I need to understand how to connect and activate Azure Stack Edge Pro so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to connect and activate Azure Stack Edge Pro so I can use it to transfer data to Azure.
# Tutorial: Connect, set up, and activate Azure Stack Edge Pro
databox-online Azure Stack Edge Deploy Install https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-deploy-install.md
Last updated 01/17/2020
-Customer intent: As an IT admin, I need to understand how to install Azure Stack Edge Pro in datacenter so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to install Azure Stack Edge Pro in datacenter so I can use it to transfer data to Azure.
# Tutorial: Install Azure Stack Edge Pro
databox-online Azure Stack Edge Deploy Prep https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-deploy-prep.md
Last updated 03/16/2021
-Customer intent: As an IT admin, I need to understand how to prepare the portal to deploy Azure Stack Edge Pro so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to prepare the portal to deploy Azure Stack Edge Pro so I can use it to transfer data to Azure.
# Tutorial: Prepare to deploy Azure Stack Edge Pro
databox-online Azure Stack Edge Gpu Deploy Activate https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-gpu-deploy-activate.md
Last updated 10/07/2020
-Customer intent: As an IT admin, I need to understand how to activate Azure Stack Edge Pro so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to activate Azure Stack Edge Pro so I can use it to transfer data to Azure.
# Tutorial: Activate Azure Stack Edge Pro with GPU
databox-online Azure Stack Edge Gpu Deploy Add Shares https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-gpu-deploy-add-shares.md
Last updated 02/22/2021
-Customer intent: As an IT admin, I need to understand how to add and connect to shares on Azure Stack Edge Pro so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to add and connect to shares on Azure Stack Edge Pro so I can use it to transfer data to Azure.
# Tutorial: Transfer data via shares with Azure Stack Edge Pro GPU
databox-online Azure Stack Edge Gpu Deploy Add Storage Accounts https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-gpu-deploy-add-storage-accounts.md
Last updated 03/12/2021
-Customer intent: As an IT admin, I need to understand how to add and connect to storage accounts on Azure Stack Edge Pro so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to add and connect to storage accounts on Azure Stack Edge Pro so I can use it to transfer data to Azure.
# Tutorial: Transfer data via storage accounts with Azure Stack Edge Pro GPU
databox-online Azure Stack Edge Gpu Deploy Compute Module Simple https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-gpu-deploy-compute-module-simple.md
Last updated 02/22/2021
-Customer intent: As an IT admin, I need to understand how to configure compute on Azure Stack Edge Pro so I can use it to transform the data before sending it to Azure.
+# Customer intent: As an IT admin, I need to understand how to configure compute on Azure Stack Edge Pro so I can use it to transform the data before sending it to Azure.
# Tutorial: Run a compute workload with IoT Edge module on Azure Stack Edge Pro GPU
databox-online Azure Stack Edge Gpu Deploy Configure Certificates https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-gpu-deploy-configure-certificates.md
Last updated 09/10/2020
-Customer intent: As an IT admin, I need to understand how to configure certificates for Azure Stack Edge Pro so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to configure certificates for Azure Stack Edge Pro so I can use it to transfer data to Azure.
# Tutorial: Configure certificates for your Azure Stack Edge Pro with GPU
databox-online Azure Stack Edge Gpu Deploy Configure Compute https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-gpu-deploy-configure-compute.md
Last updated 03/08/2021
-Customer intent: As an IT admin, I need to understand how to configure compute on Azure Stack Edge Pro so I can use it to transform the data before sending it to Azure.
+# Customer intent: As an IT admin, I need to understand how to configure compute on Azure Stack Edge Pro so I can use it to transform the data before sending it to Azure.
# Tutorial: Configure compute on Azure Stack Edge Pro GPU device
databox-online Azure Stack Edge Gpu Deploy Configure Network Compute Web Proxy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-gpu-deploy-configure-network-compute-web-proxy.md
Last updated 02/04/2021
-Customer intent: As an IT admin, I need to understand how to connect and activate Azure Stack Edge Pro so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to connect and activate Azure Stack Edge Pro so I can use it to transfer data to Azure.
# Tutorial: Configure network for Azure Stack Edge Pro with GPU
databox-online Azure Stack Edge Gpu Deploy Connect https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-gpu-deploy-connect.md
Last updated 08/29/2020
-Customer intent: As an IT admin, I need to understand how to connect and activate Azure Stack Edge Pro so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to connect and activate Azure Stack Edge Pro so I can use it to transfer data to Azure.
# Tutorial: Connect to Azure Stack Edge Pro with GPU
databox-online Azure Stack Edge Gpu Deploy Install https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-gpu-deploy-install.md
Last updated 12/21/2020
-Customer intent: As an IT admin, I need to understand how to install Azure Stack Edge Pro in datacenter so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to install Azure Stack Edge Pro in datacenter so I can use it to transfer data to Azure.
# Tutorial: Install Azure Stack Edge Pro with GPU
databox-online Azure Stack Edge Gpu Deploy Prep https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-gpu-deploy-prep.md
Last updated 03/03/2021
-Customer intent: As an IT admin, I need to understand how to prepare the portal to deploy Azure Stack Edge Pro so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to prepare the portal to deploy Azure Stack Edge Pro so I can use it to transfer data to Azure.
# Tutorial: Prepare to deploy Azure Stack Edge Pro with GPU
databox-online Azure Stack Edge Gpu Deploy Set Up Device Update Time https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-gpu-deploy-set-up-device-update-time.md
Last updated 09/10/2020
-Customer intent: As an IT admin, I need to understand how to connect and activate Azure Stack Edge Pro so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to connect and activate Azure Stack Edge Pro so I can use it to transfer data to Azure.
# Tutorial: Configure the device settings for Azure Stack Edge Pro with GPU
databox-online Azure Stack Edge Gpu Deploy Virtual Machine Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-gpu-deploy-virtual-machine-portal.md
Last updated 02/22/2021
-Customer intent: As an IT admin, I need to understand how to configure compute on Azure Stack Edge Pro device so I can use it to transform the data before sending it to Azure.
+# Customer intent: As an IT admin, I need to understand how to configure compute on Azure Stack Edge Pro device so I can use it to transform the data before sending it to Azure.
# Deploy VMs on your Azure Stack Edge Pro GPU device via the Azure portal
databox-online Azure Stack Edge Gpu Manage Virtual Machine Network Interfaces Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-gpu-manage-virtual-machine-network-interfaces-portal.md
Last updated 03/23/2021
-Customer intent: As an IT admin, I need to understand how to manage network interfaces on an Azure Stack Edge Pro device so that I can use it to run applications using Edge compute before sending it to Azure.
+# Customer intent: As an IT admin, I need to understand how to manage network interfaces on an Azure Stack Edge Pro device so that I can use it to run applications using Edge compute before sending it to Azure.
# Use the Azure portal to manage network interfaces on the VMs on your Azure Stack Edge Pro GPU
databox-online Azure Stack Edge Gpu Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-gpu-quickstart.md
Last updated 01/27/2021
-Customer intent: As an IT admin, I need to understand how to prepare the portal to quickly deploy Azure Stack Edge so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to prepare the portal to quickly deploy Azure Stack Edge so I can use it to transfer data to Azure.
# Quickstart: Get started with Azure Stack Edge Pro with GPU
databox-online Azure Stack Edge J Series Deploy Configure Compute https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-j-series-deploy-configure-compute.md
Last updated 01/05/2021
-Customer intent: As an IT admin, I need to understand how to configure compute on Azure Stack Edge Pro so I can use it to transform the data before sending it to Azure.
+# Customer intent: As an IT admin, I need to understand how to configure compute on Azure Stack Edge Pro so I can use it to transform the data before sending it to Azure.
# Tutorial: Transform data with Azure Stack Edge Pro
databox-online Azure Stack Edge Mini R Deploy Activate https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-mini-r-deploy-activate.md
Last updated 10/22/2020
-Customer intent: As an IT admin, I need to understand how to activate Azure Stack Edge Mini R so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to activate Azure Stack Edge Mini R so I can use it to transfer data to Azure.
# Tutorial: Activate Azure Stack Edge Mini R
databox-online Azure Stack Edge Mini R Deploy Configure Certificates Vpn Encryption https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-mini-r-deploy-configure-certificates-vpn-encryption.md
Last updated 10/21/2020
-Customer intent: As an IT admin, I need to understand how to configure certificates for Azure Stack Edge Mini R so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to configure certificates for Azure Stack Edge Mini R so I can use it to transfer data to Azure.
# Tutorial: Configure certificates, VPN, encryption for your Azure Stack Edge Mini R
databox-online Azure Stack Edge Mini R Deploy Configure Network Compute Web Proxy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-mini-r-deploy-configure-network-compute-web-proxy.md
Last updated 02/04/2021
-Customer intent: As an IT admin, I need to understand how to connect and activate Azure Stack Edge Mini R so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to connect and activate Azure Stack Edge Mini R so I can use it to transfer data to Azure.
# Tutorial: Configure network for Azure Stack Edge Mini R
databox-online Azure Stack Edge Mini R Deploy Connect https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-mini-r-deploy-connect.md
Last updated 10/20/2020
-Customer intent: As an IT admin, I need to understand how to connect and activate Azure Stack Edge Mini R so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to connect and activate Azure Stack Edge Mini R so I can use it to transfer data to Azure.
# Tutorial: Connect to Azure Stack Edge Mini R
databox-online Azure Stack Edge Mini R Deploy Install https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-mini-r-deploy-install.md
Last updated 10/20/2020
-Customer intent: As an IT admin, I need to understand how to install Azure Stack Edge Mini R device in datacenter so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to install Azure Stack Edge Mini R device in datacenter so I can use it to transfer data to Azure.
# Tutorial: Install Azure Stack Edge Mini R
databox-online Azure Stack Edge Mini R Deploy Prep https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-mini-r-deploy-prep.md
Last updated 01/22/2021
-Customer intent: As an IT admin, I need to understand how to prepare the portal to deploy Azure Stack Edge Mini R device so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to prepare the portal to deploy Azure Stack Edge Mini R device so I can use it to transfer data to Azure.
# Tutorial: Prepare to deploy Azure Stack Edge Mini R
databox-online Azure Stack Edge Mini R Deploy Set Up Device Update Time https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-mini-r-deploy-set-up-device-update-time.md
Last updated 10/14/2020
-Customer intent: As an IT admin, I need to understand how to connect and activate Azure Stack Edge Mini R so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to connect and activate Azure Stack Edge Mini R so I can use it to transfer data to Azure.
# Tutorial: Configure the device settings for Azure Stack Edge Mini R
databox-online Azure Stack Edge Pro R Deploy Activate https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-pro-r-deploy-activate.md
Last updated 02/23/2021
-Customer intent: As an IT admin, I need to understand how to activate Azure Stack Edge Pro R device so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to activate Azure Stack Edge Pro R device so I can use it to transfer data to Azure.
# Tutorial: Activate Azure Stack Edge Pro R device
databox-online Azure Stack Edge Pro R Deploy Configure Certificates Vpn Encryption https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-pro-r-deploy-configure-certificates-vpn-encryption.md
Last updated 10/19/2020
-Customer intent: As an IT admin, I need to understand how to configure certificates for Azure Stack Edge Pro R so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to configure certificates for Azure Stack Edge Pro R so I can use it to transfer data to Azure.
# Tutorial: Configure certificates for your Azure Stack Edge Pro R
databox-online Azure Stack Edge Pro R Deploy Configure Network Compute Web Proxy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-pro-r-deploy-configure-network-compute-web-proxy.md
Last updated 02/04/2021
-Customer intent: As an IT admin, I need to understand how to connect and activate Azure Stack Edge Pro R so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to connect and activate Azure Stack Edge Pro R so I can use it to transfer data to Azure.
# Tutorial: Configure network for Azure Stack Edge Pro R
databox-online Azure Stack Edge Pro R Deploy Connect https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-pro-r-deploy-connect.md
Last updated 10/15/2020
-Customer intent: As an IT admin, I need to understand how to connect and activate Azure Stack Edge Pro R so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to connect and activate Azure Stack Edge Pro R so I can use it to transfer data to Azure.
# Tutorial: Connect to Azure Stack Edge Pro R
databox-online Azure Stack Edge Pro R Deploy Install https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-pro-r-deploy-install.md
Last updated 10/18/2020
-Customer intent: As an IT admin, I need to understand how to install Azure Stack Edge Pro R in datacenter so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to install Azure Stack Edge Pro R in datacenter so I can use it to transfer data to Azure.
# Tutorial: Install Azure Stack Edge Pro R
databox-online Azure Stack Edge Pro R Deploy Prep https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-pro-r-deploy-prep.md
Last updated 01/22/2021
-Customer intent: As an IT admin, I need to understand how to prepare the portal to deploy Azure Stack Edge Pro R so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to prepare the portal to deploy Azure Stack Edge Pro R so I can use it to transfer data to Azure.
# Tutorial: Prepare to deploy Azure Stack Edge Pro R
databox-online Azure Stack Edge Pro R Deploy Set Up Device Update Time https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-pro-r-deploy-set-up-device-update-time.md
Last updated 10/18/2020
-Customer intent: As an IT admin, I need to understand how to connect and activate Azure Stack Edge Pro so I can use it to transfer data to Azure.
+# Customer intent: As an IT admin, I need to understand how to connect and activate Azure Stack Edge Pro so I can use it to transfer data to Azure.
# Tutorial: Configure the device settings for Azure Stack Edge Pro R
databox Data Box Deploy Ordered https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox/data-box-deploy-ordered.md
Title: Tutorial to order Azure Data Box | Microsoft Docs
description: In this tutorial, learn about Azure Data Box, a hybrid solution that allows you to import on-premises data into Azure, and how to order Azure Data Box.
To cancel an Azure Data Box order, run [`az databox job cancel`](/cli/azure/ext/
|resource-group [Required]| The name of the resource group associated with the order to be deleted. A resource group is a logical container for the resources that can be managed or deployed together. | "myresourcegroup"|
|name [Required]| The name of the order to be deleted. | "mydataboxorder"|
|reason [Required]| The reason for canceling the order. | "I entered erroneous information and needed to cancel the order." |
- |yes| Do not prompt for confirmation. | --yes (-y)| --yes -y |
+ |yes| Do not prompt for confirmation. | --yes (-y)|
|debug| Include debugging information in verbose logging. | --debug |
|help| Display help information for this command. | --help -h |
|only-show-errors| Only show errors, suppressing warnings. | --only-show-errors |
If you have canceled an Azure Data Box order, you can run [`az databox job delet
|resource-group [Required]| The name of the resource group associated with the order to be deleted. A resource group is a logical container for the resources that can be managed or deployed together. | "myresourcegroup"|
|name [Required]| The name of the order to be deleted. | "mydataboxorder"|
|subscription| The name or ID (GUID) of your Azure subscription. | "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" |
- |yes| Do not prompt for confirmation. | --yes (-y)| --yes -y |
+ |yes| Do not prompt for confirmation. | --yes (-y)|
|debug| Include debugging information in verbose logging. | --debug |
|help| Display help information for this command. | --help -h |
|only-show-errors| Only show errors, suppressing warnings. | --only-show-errors |
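Putting the parameters above together, typical invocations might look like the sketch below. The resource group and order names are the placeholder values from the tables, and the commands require an authenticated Azure CLI session with the Data Box extension to actually run:

```shell
# Cancel the order without a confirmation prompt (--yes)
az databox job cancel \
  --resource-group myresourcegroup \
  --name mydataboxorder \
  --reason "I entered erroneous information and needed to cancel the order." \
  --yes

# After the order is canceled, delete it
az databox job delete \
  --resource-group myresourcegroup \
  --name mydataboxorder \
  --yes
```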
In this tutorial, you learned about Azure Data Box articles such as:
Advance to the next tutorial to learn how to set up your Data Box.
> [!div class="nextstepaction"]
-> [Set up your Azure Data Box](./data-box-deploy-set-up.md)
+> [Set up your Azure Data Box](./data-box-deploy-set-up.md)
databox Data Box Disk Deploy Ordered https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox/data-box-disk-deploy-ordered.md
Last updated 07/03/2019
-Customer intent: As an IT admin, I need to be able to order Data Box Disk to upload on-premises data from my server onto Azure.
+# Customer intent: As an IT admin, I need to be able to order Data Box Disk to upload on-premises data from my server onto Azure.
# Tutorial: Order an Azure Data Box Disk
databox Data Box Disk Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox/data-box-disk-overview.md
Last updated 06/18/2019
-Customer intent: As an IT admin, I need to understand what Data Box Disk is and how it works so I can use it to import on-premises data into Azure.
+# Customer intent: As an IT admin, I need to understand what Data Box Disk is and how it works so I can use it to import on-premises data into Azure.
# What is Azure Data Box Disk?
dms Tutorial Mysql Azure Mysql Online https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/dms/tutorial-mysql-azure-mysql-online.md
Last updated 01/08/2020
You can use Azure Database Migration Service to migrate the databases from an on-premises MySQL instance to [Azure Database for MySQL](../mysql/index.yml) with minimal downtime. In other words, migration can be achieved with minimum downtime to the application. In this tutorial, you migrate the **Employees** sample database from an on-premises instance of MySQL 5.7 to Azure Database for MySQL by using an online migration activity in Azure Database Migration Service.
+> [!IMPORTANT]
+> The "MySQL to Azure Database for MySQL" online migration scenario is being replaced with a parallelized, highly performant offline migration scenario on June 1, 2021. For online migrations, you can use this new offering together with [data-in replication](https://docs.microsoft.com/azure/mysql/concepts-data-in-replication). Alternatively, use open-source tools such as [MyDumper/MyLoader](https://centminmod.com/mydumper.html) with data-in replication for online migrations.
+ In this tutorial, you learn how to:
> [!div class="checklist"]
>
After the initial Full load is completed, the databases are marked **Ready to cu
* For information about known issues and limitations when performing online migrations to Azure Database for MySQL, see the article [Known issues and workarounds with Azure Database for MySQL online migrations](known-issues-azure-mysql-online.md). * For information about Azure Database Migration Service, see the article [What is Azure Database Migration Service?](./dms-overview.md).
-* For information about Azure Database for MySQL, see the article [What is Azure Database for MySQL?](../mysql/overview.md).
+* For information about Azure Database for MySQL, see the article [What is Azure Database for MySQL?](../mysql/overview.md).
dms Tutorial Rds Mysql Server Azure Db For Mysql Online https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/dms/tutorial-rds-mysql-server-azure-db-for-mysql-online.md
Last updated 06/09/2020
You can use Azure Database Migration Service to migrate databases from an RDS MySQL instance to [Azure Database for MySQL](../mysql/index.yml) while the source database remains online during migration. In other words, migration can be achieved with minimal downtime to the application. In this tutorial, you migrate the **Employees** sample database from an instance of RDS MySQL to Azure Database for MySQL by using the online migration activity in Azure Database Migration Service.
+> [!IMPORTANT]
+> The "RDS MySQL to Azure Database for MySQL" online migration scenario is being replaced with a parallelized, highly performant offline migration scenario on June 1, 2021. For online migrations, you can use this new offering together with [data-in replication](https://docs.microsoft.com/azure/mysql/concepts-data-in-replication). Alternatively, use open-source tools such as [MyDumper/MyLoader](https://centminmod.com/mydumper.html) with data-in replication for online migrations.
+ In this tutorial, you learn how to:
> [!div class="checklist"]
>
Your online migration of an on-premises instance of MySQL to Azure Database for
* For information about the Azure Database Migration Service, see the article [What is the Azure Database Migration Service?](./dms-overview.md). * For information about Azure Database for MySQL, see the article [What is Azure Database for MySQL?](../mysql/overview.md).
-* For other questions, email the [Ask Azure Database Migrations](mailto:AskAzureDatabaseMigrations@service.microsoft.com) alias.
+* For other questions, email the [Ask Azure Database Migrations](mailto:AskAzureDatabaseMigrations@service.microsoft.com) alias.
event-grid Delivery And Retry https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/event-grid/delivery-and-retry.md
This article describes how Azure Event Grid handles events when delivery isn't a
Event Grid provides durable delivery. It delivers each message **at least once** for each subscription. Events are sent to the registered endpoint of each subscription immediately. If an endpoint doesn't acknowledge receipt of an event, Event Grid retries delivery of the event.
> [!NOTE]
-> Event Grid doesn't guarantee order for event delivery, so subscriber may receive them out of order.
+> Event Grid doesn't guarantee order for event delivery, so subscribers may receive them out of order.
## Batched event delivery
For more information on using Azure CLI with Event Grid, see [Route storage even
## Retry schedule and duration
-When EventGrid receives an error for an event delivery attempt, EventGrid decides whether it should retry the delivery or dead-letter or drop the event based on the type of the error.
+When EventGrid receives an error for an event delivery attempt, EventGrid decides whether it should retry the delivery, dead-letter the event, or drop the event based on the type of the error.
-If the error returned by the subscribed endpoint is configuration-related error that can't be fixed with retries (for example, if the endpoint is deleted), EventGrid will either perform dead lettering the event or drop the event if dead letter isn't configured.
+If the error returned by the subscribed endpoint is a configuration-related error that can't be fixed with retries (for example, if the endpoint is deleted), EventGrid will either perform dead-lettering on the event or drop the event if dead-letter isn't configured.
-Following are the types of endpoints for which retry doesn't happen:
+The following table describes the types of endpoints and errors for which retry doesn't happen:
| Endpoint Type | Error codes |
| --| --|
Following are the types of endpoints for which retry doesn't happen:
| Webhook | 400 Bad Request, 413 Request Entity Too Large, 403 Forbidden, 404 Not Found, 401 Unauthorized |
> [!NOTE]
-> If Dead-Letter isn't configured for endpoint, events will be dropped when above errors happen. Consider configuring Dead-Letter, if you don't want these kinds of events to be dropped.
+> If Dead-Letter isn't configured for an endpoint, events will be dropped when the above errors happen. Consider configuring Dead-Letter if you don't want these kinds of events to be dropped.
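The decision described in the table and note above can be sketched as follows. The function name and the error-code set are illustrative only (taken from the webhook row of the table), not part of any Event Grid API:

```python
# Webhook HTTP status codes the table above lists as non-retryable.
NON_RETRYABLE = {400, 401, 403, 404, 413}

def handle_delivery_error(status_code: int, dead_letter_configured: bool) -> str:
    """Return the action taken for a failed webhook delivery attempt."""
    if status_code in NON_RETRYABLE:
        # Configuration-related errors are never retried: the event is
        # dead-lettered if a dead-letter destination exists, else dropped.
        return "dead-letter" if dead_letter_configured else "drop"
    # Any other error falls through to the retry schedule.
    return "retry"
```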
If the error returned by the subscribed endpoint isn't listed above, EventGrid retries the delivery using the policies described below:
If the endpoint responds within 3 minutes, Event Grid will attempt to remove the
Event Grid adds a small randomization to all retry steps and may opportunistically skip certain retries if an endpoint is consistently unhealthy, down for a long period, or appears to be overwhelmed.
-For deterministic behavior, set the event time to live and max delivery attempts in the [subscription retry policies](manage-event-delivery.md).
+For deterministic behavior, set the event time-to-live and max delivery attempts in the [subscription retry policies](manage-event-delivery.md).
By default, Event Grid expires all events that aren't delivered within 24 hours. You can [customize the retry policy](manage-event-delivery.md) when creating an event subscription. You provide the maximum number of delivery attempts (default is 30) and the event time-to-live (default is 1440 minutes).
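The two limits just described can be sketched together. This is a minimal illustration using the stated defaults (30 delivery attempts, 1440-minute time-to-live); the function is hypothetical, not Event Grid code:

```python
from datetime import datetime, timedelta

# Defaults from the text; both are configurable per subscription.
MAX_DELIVERY_ATTEMPTS = 30
EVENT_TTL = timedelta(minutes=1440)

def should_attempt_delivery(published_at: datetime,
                            attempts_so_far: int,
                            now: datetime) -> bool:
    """An event is retried only while both limits hold. Note that expiry
    is evaluated at the next scheduled attempt, not continuously."""
    if attempts_so_far >= MAX_DELIVERY_ATTEMPTS:
        return False
    if now - published_at >= EVENT_TTL:
        return False
    return True
```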
Event Grid sends an event to the dead-letter location when it has tried all of i
The time-to-live expiration is checked ONLY at the next scheduled delivery attempt. So, even if the time-to-live expires before the next scheduled delivery attempt, expiry is checked only at the time of that next delivery, and the event is then dead-lettered.
-There is a five-minute delay between the last attempt to deliver an event and when it is delivered to the dead-letter location. This delay is intended to reduce the number Blob storage operations. If the dead-letter location is unavailable for four hours, the event is dropped.
+There is a five-minute delay between the last attempt to deliver an event and when it is delivered to the dead-letter location. This delay is intended to reduce the number of Blob storage operations. If the dead-letter location is unavailable for four hours, the event is dropped.
Before setting the dead-letter location, you must have a storage account with a container. You provide the endpoint for this container when creating the event subscription. The endpoint is in the format of: `/subscriptions/<subscription-id>/resourceGroups/<resource-group-name>/providers/Microsoft.Storage/storageAccounts/<storage-name>/blobServices/default/containers/<container-name>`
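The container endpoint shown above is a standard Azure resource ID, so it can be assembled from its components. A minimal sketch (the helper name is illustrative):

```python
def dead_letter_endpoint(subscription_id: str, resource_group: str,
                         storage_name: str, container_name: str) -> str:
    """Build the resource ID of the blob container that serves as the
    dead-letter destination for an event subscription."""
    return (f"/subscriptions/{subscription_id}"
            f"/resourceGroups/{resource_group}"
            f"/providers/Microsoft.Storage/storageAccounts/{storage_name}"
            f"/blobServices/default/containers/{container_name}")
```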
-You might want to be notified when an event has been sent to the dead letter location. To use Event Grid to respond to undelivered events, [create an event subscription](../storage/blobs/storage-blob-event-quickstart.md?toc=%2fazure%2fevent-grid%2ftoc.json) for the dead-letter blob storage. Every time your dead-letter blob storage receives an undelivered event, Event Grid notifies your handler. The handler responds with actions you wish to take for reconciling undelivered events. For an example of setting up a dead letter location and retry policies, see [Dead letter and retry policies](manage-event-delivery.md).
+You might want to be notified when an event has been sent to the dead-letter location. To use Event Grid to respond to undelivered events, [create an event subscription](../storage/blobs/storage-blob-event-quickstart.md?toc=%2fazure%2fevent-grid%2ftoc.json) for the dead-letter blob storage. Every time your dead-letter blob storage receives an undelivered event, Event Grid notifies your handler. The handler responds with actions you wish to take for reconciling undelivered events. For an example of setting up a dead-letter location and retry policies, see [Dead letter and retry policies](manage-event-delivery.md).
## Delivery event formats

This section gives you examples of events and dead-lettered events in different delivery schema formats (Event Grid schema, CloudEvents 1.0 schema, and custom schema). For more information about these formats, see the [Event Grid schema](event-schema.md) and [CloudEvents 1.0 schema](cloud-event-schema.md) articles.
All other codes not in the above set (200-204) are considered failures and will be retried.
| All others | Retry after 10 seconds or more |

## Delivery with custom headers
-Event subscriptions allow you to set up http headers that are included in delivered events. This capability allows you to set custom headers that are required by a destination. You can set up to 10 headers when creating an event subscription. Each header value shouldn't be greater than 4,096 (4K) bytes. You can set custom headers on the events that are delivered to the following destinations:
+Event subscriptions allow you to set up HTTP headers that are included in delivered events. This capability allows you to set custom headers that are required by a destination. You can set up to 10 headers when creating an event subscription. Each header value shouldn't be greater than 4,096 (4K) bytes. You can set custom headers on the events that are delivered to the following destinations:
- Webhooks
- Azure Service Bus topics and queues
event-grid Delivery Properties https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/event-grid/delivery-properties.md
Last updated 03/24/2021
# Delivery with custom headers
-Event subscriptions allow you to set up http headers that are included in delivered events. This capability allows you to set custom headers that are required by a destination. You can set up to 10 headers when creating an event subscription. Each header value shouldn't be greater than 4,096 (4K) bytes.
+Event subscriptions allow you to set up HTTP headers that are included in delivered events. This capability allows you to set custom headers that are required by a destination. You can set up to 10 headers when creating an event subscription. Each header value shouldn't be greater than 4,096 (4K) bytes.
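The two documented limits (at most 10 headers, each value at most 4,096 bytes) can be checked up front before creating the subscription. A minimal sketch (the validator is illustrative, not part of any Azure SDK):

```python
MAX_HEADERS = 10          # documented per-subscription limit
MAX_VALUE_BYTES = 4096    # documented per-header-value limit

def validate_delivery_headers(headers: dict) -> None:
    """Raise ValueError if the proposed custom headers exceed the
    documented Event Grid delivery-property limits."""
    if len(headers) > MAX_HEADERS:
        raise ValueError(f"at most {MAX_HEADERS} custom headers are allowed")
    for name, value in headers.items():
        if len(value.encode("utf-8")) > MAX_VALUE_BYTES:
            raise ValueError(
                f"value for header '{name}' exceeds {MAX_VALUE_BYTES} bytes")
```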
You can set custom headers on the events that are delivered to the following destinations:
You can set custom headers on the events that are delivered to the following des
- Azure Event Hubs
- Relay Hybrid Connections
-When creating an event subscription in the Azure portal, you can use the **Delivery Properties** tab to set custom http headers. This page lets you set fixed and dynamic header values.
+When creating an event subscription in the Azure portal, you can use the **Delivery Properties** tab to set custom HTTP headers. This page lets you set fixed and dynamic header values.
## Setting static header values

To set headers with a fixed value, provide the name of the header and its value in the corresponding fields:
To set headers with a fixed value, provide the name of the header and its value
You may want to check **Is secret?** when providing sensitive data. Sensitive data won't be displayed on the Azure portal.

## Setting dynamic header values
-You can set the value of a header based on a property in an incoming event. Use JsonPath syntax to refer to an incoming eventΓÇÖs property value to be used as the value for a header in outgoing requests. For example, to set the value of a header named **Channel** using the value of the incoming event property **system** in the event data, configure your event subscription in the following way:
+You can set the value of a header based on a property in an incoming event. Use JsonPath syntax to refer to an incoming event's property value to be used as the value for a header in outgoing requests. For example, to set the value of a header named **Channel** using the value of the incoming event property **system** in the event data, configure your event subscription in the following way:
:::image type="content" source="./media/delivery-properties/dynamic-header-property.png" alt-text="Delivery properties - dynamic":::
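The dynamic-header lookup above can be sketched with a simplified resolver for dotted paths such as `$.data.system`. This handles only the dotted subset for illustration; Event Grid's own JsonPath support is richer:

```python
def resolve_header_value(event: dict, json_path: str):
    """Resolve a simple dotted JsonPath (e.g. '$.data.system') against
    an incoming event to produce the outgoing header value."""
    node = event
    for part in json_path.lstrip("$.").split("."):
        node = node[part]
    return node

# Incoming event carrying a 'system' property in its data payload
event = {"id": "1", "data": {"system": "orders"}}
headers = {"Channel": resolve_header_value(event, "$.data.system")}
```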
If you need to publish events to a specific partition within an event hub, defin
### Configure time to live on outgoing events to Azure Storage Queues
-For the Azure Storage Queues destination, you can only configure the time-to-live the outgoing message will have once it has been delivered to an Azure Storage queue. If no time is provided, the messageΓÇÖs default time to live is 7 days. You can also set the event to never expire.
+For the Azure Storage Queues destination, you can only configure the time-to-live that the outgoing message will have once it has been delivered to an Azure Storage queue. If no time is provided, the message's default time-to-live is 7 days. You can also set the event to never expire.
:::image type="content" source="./media/delivery-properties/delivery-properties-storage-queue.png" alt-text="Delivery properties - storage queue":::
event-grid Network Security https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/event-grid/network-security.md
You can use service tags to define network access controls on [network security
## IP firewall

Azure Event Grid supports IP-based access controls for publishing to topics and domains. With IP-based controls, you can limit the publishers to a topic or domain to only a set of approved machines and cloud services. This feature complements the [authentication mechanisms](security-authentication.md) supported by Event Grid.
-By default, topic and domain are accessible from internet as long as the request comes with valid authentication and authorization. With IP firewall, you can restrict it further to only a set of IP addresses or IP address ranges in [CIDR (Classless Inter-Domain Routing)](https://en.wikipedia.org/wiki/Classless_Inter-Domain_Routing) notation. Publishers originating from any other IP address will be rejected and will receive a 403 (Forbidden) response.
+By default, topics and domains are accessible from the internet as long as the request comes with valid authentication and authorization. With IP firewall, you can restrict access further to only a set of IP addresses or IP address ranges in [CIDR (Classless Inter-Domain Routing)](https://en.wikipedia.org/wiki/Classless_Inter-Domain_Routing) notation. Publishers originating from any other IP address will be rejected and will receive a 403 (Forbidden) response.
For step-by-step instructions to configure IP firewall for topics and domains, see [Configure IP firewall](configure-firewall.md).
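The accept/reject behavior described above can be sketched with Python's standard `ipaddress` module (the function is illustrative; the real enforcement happens in the service):

```python
import ipaddress

def is_publisher_allowed(client_ip: str, allowed_cidrs: list) -> bool:
    """Mimic the IP-firewall semantics: a publisher is accepted only if
    its address falls in one of the configured CIDR ranges; otherwise
    the service answers 403 (Forbidden)."""
    ip = ipaddress.ip_address(client_ip)
    return any(ip in ipaddress.ip_network(cidr) for cidr in allowed_cidrs)
```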
You can use [private endpoints](../private-link/private-endpoint-overview.md) to
Using private endpoints for your Event Grid resource enables you to:

-- Secure access to your topic or domain from a VNet over Microsoft backbone network as opposed to the public internet.
+- Secure access to your topic or domain from a VNet over the Microsoft backbone network as opposed to the public internet.
- Securely connect from on-premises networks that connect to the VNet using VPN or ExpressRoute with private peering.

When you create a private endpoint for a topic or domain in your VNet, a consent request is sent for approval to the resource owner. If the user requesting the creation of the private endpoint is also an owner of the resource, this consent request is automatically approved. Otherwise, the connection is in a **pending** state until approved. Applications in the VNet can connect to the Event Grid service over the private endpoint seamlessly, using the same connection strings and authorization mechanisms that they would use otherwise. Resource owners can manage consent requests and the private endpoints through the **Private endpoints** tab for the resource in the Azure portal.
You can configure IP firewall for your Event Grid resource to restrict access ov
You can configure private endpoints to restrict access to only selected virtual networks. For step-by-step instructions, see [Configure private endpoints](configure-private-endpoints.md).
-To troubleshoot network connectivity issues, see [Troubleshoot network connectivity issues](troubleshoot-network-connectivity.md)
+To troubleshoot network connectivity issues, see [Troubleshoot network connectivity issues](troubleshoot-network-connectivity.md).
event-grid Security Authentication https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/event-grid/security-authentication.md
As query parameters could contain client secrets, they are handled with extra care.
For more information on delivering events to webhooks, see [Webhook event delivery](webhook-event-delivery.md).

> [!IMPORTANT]
-Azure Event Grid only supports **HTTPS** webhook endpoints.
+> Azure Event Grid only supports **HTTPS** webhook endpoints.
## Endpoint validation with CloudEvents v1.0

If you're already familiar with Event Grid, you might be aware of the endpoint validation handshake for preventing abuse. CloudEvents v1.0 implements its own [abuse protection semantics](webhook-event-delivery.md) by using the **HTTP OPTIONS** method. To read more about it, see [HTTP 1.1 Web Hooks for event delivery - Version 1.0](https://github.com/cloudevents/spec/blob/v1.0/http-webhook.md#4-abuse-protection). When you use the CloudEvents schema for output, Event Grid uses the CloudEvents v1.0 abuse protection in place of the Event Grid validation event mechanism. For more information, see [Use CloudEvents v1.0 schema with Event Grid](cloudevents-schema.md).
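Per the CloudEvents webhook spec, the endpoint answers an HTTP OPTIONS probe by echoing the requesting origin in a `WebHook-Allowed-Origin` header. A minimal sketch of that handler (the header names follow the spec; the request/response plumbing here is illustrative):

```python
def handle_options(request_headers: dict) -> dict:
    """Sketch of the CloudEvents v1.0 abuse-protection handshake:
    echo the WebHook-Request-Origin back as WebHook-Allowed-Origin
    to authorize the sender for subsequent POST deliveries."""
    origin = request_headers.get("WebHook-Request-Origin", "")
    return {"WebHook-Allowed-Origin": origin, "Allow": "POST"}
```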
event-grid Security Baseline https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/event-grid/security-baseline.md
# Azure security baseline for Event Grid
-This security
-baseline applies guidance from the [Azure Security Benchmark version
+This security baseline applies guidance from the [Azure Security Benchmark version
1.0](../security/benchmarks/overview-v1.md) to Microsoft Azure Event Grid. The Azure Security Benchmark provides recommendations on how you can secure your cloud solutions on Azure. The content is grouped by the **security controls** defined by the Azure
authentication mechanisms supported by Event Grid.
### 1.2: Monitor and log the configuration and traffic of virtual networks, subnets, and NICs

**Guidance**: Use Azure Security Center and follow network protection recommendations to help secure your Event Grid resources in Azure. If using Azure virtual machines to access your Event Grid resources, enable network security group (NSG) flow logs and send logs into a storage account for traffic audit.

- [How to Enable NSG Flow Logs](../network-watcher/network-watcher-nsg-flow-logging-portal.md)
event-grid Webhook Event Delivery https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/event-grid/webhook-event-delivery.md
To prove endpoint ownership, echo back the validation code in the validationResponse property, as shown in the following example:
} ```
-You must return an HTTP 200 OK response status code. HTTP 202 Accepted is not recognized as a valid Event Grid subscription validation response. The http request must complete within 30 seconds. If the operation doesn't finish within 30 seconds, then the operation will be canceled and it may be reattempted after 5 seconds. If all the attempts fail, then it will be treated as validation handshake error.
+You must return an HTTP 200 OK response status code. HTTP 202 Accepted is not recognized as a valid Event Grid subscription validation response. The HTTP request must complete within 30 seconds. If the operation doesn't finish within 30 seconds, it will be canceled and may be reattempted after 5 seconds. If all the attempts fail, it will be treated as a validation handshake error.
Or, you can manually validate the subscription by sending a GET request to the validation URL. The event subscription stays in a pending state until validated. The validation URL uses port 443. If your firewall rules block port 443, rules may need to be updated for a successful manual handshake.
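The echo response required by the handshake is small enough to sketch directly (the function name is illustrative; the `validationResponse` property is the one the service expects):

```python
import json

def validation_response(validation_event: dict) -> str:
    """Build the body that echoes the validationCode back to Event
    Grid during the subscription-validation handshake. Must be
    returned with an HTTP 200 OK status."""
    code = validation_event["data"]["validationCode"]
    return json.dumps({"validationResponse": code})
```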
event-hubs Event Hubs Ip Filtering https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/event-hubs/event-hubs-ip-filtering.md
Title: Azure Event Hubs Firewall Rules | Microsoft Docs description: Use Firewall Rules to allow connections from specific IP addresses to Azure Event Hubs. Previously updated : 02/12/2021 Last updated : 03/29/2021 # Allow access to Azure Event Hubs namespaces from specific IP addresses or ranges
By default, Event Hubs namespaces are accessible from the internet as long as the request comes with valid authentication and authorization.
This feature is helpful in scenarios in which Azure Event Hubs should be accessible only from certain well-known sites. Firewall rules enable you to configure rules to accept traffic originating from specific IPv4 addresses. For example, if you use Event Hubs with [Azure Express Route][express-route], you can create a **firewall rule** to allow traffic from only your on-premises infrastructure IP addresses.
->[!WARNING]
-> Turning on firewall rules for your Event Hubs namespace blocks incoming requests by default, unless requests originate from a service operating from allowed public IP addresses. Requests that are blocked include those from other Azure services, from the Azure portal, from logging and metrics services, and so on. As an exception, you can allow access to Event Hubs resources from certain trusted services even when the IP filtering is enabled. For a list of trusted services, see [Trusted Microsoft services](#trusted-microsoft-services).
+## IP firewall rules
+The IP firewall rules are applied at the Event Hubs namespace level. So, the rules apply to all connections from clients using any supported protocol. Any connection attempt from an IP address that doesn't match an allowed IP rule on the Event Hubs namespace is rejected as unauthorized. The response doesn't mention the IP rule. IP filter rules are applied in order, and the first rule that matches the IP address determines the accept or reject action.
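The in-order, first-match evaluation described above can be sketched as follows (an illustrative model, not Event Hubs code):

```python
import ipaddress

def evaluate_ip_rules(client_ip: str, rules: list) -> str:
    """Apply IP filter rules in order: the first CIDR that contains
    the address decides the action; an unmatched address is rejected
    as unauthorized, as described above."""
    ip = ipaddress.ip_address(client_ip)
    for cidr, action in rules:
        if ip in ipaddress.ip_network(cidr):
            return action
    return "reject"

# Rule order matters: 10.0.0.0/8 is accepted before the catch-all reject.
rules = [("10.0.0.0/8", "accept"), ("0.0.0.0/0", "reject")]
```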
-> [!IMPORTANT]
-> Specify at least one IP rule or virtual network rule for the namespace to allow traffic only from the specified IP addresses or subnet of a virtual network. If there are no IP and virtual network rules, the namespace can be accessed over the public internet (using the access key).
+## Important points
+- This feature is supported for both **standard** and **dedicated** tiers. It's not supported in the **basic** tier.
+- Turning on firewall rules for your Event Hubs namespace blocks incoming requests by default, unless requests originate from a service operating from allowed public IP addresses. Requests that are blocked include those from other Azure services, from the Azure portal, from logging and metrics services, and so on. As an exception, you can allow access to Event Hubs resources from certain **trusted services** even when the IP filtering is enabled. For a list of trusted services, see [Trusted Microsoft services](#trusted-microsoft-services).
+- Specify **at least one IP firewall rule or virtual network rule** for the namespace to allow traffic only from the specified IP addresses or subnet of a virtual network. If there are no IP and virtual network rules, the namespace can be accessed over the public internet (using the access key).
-## IP firewall rules
-The IP firewall rules are applied at the Event Hubs namespace level. So, the rules apply to all connections from clients using any supported protocol. Any connection attempt from an IP address that doesn't match an allowed IP rule on the Event Hubs namespace is rejected as unauthorized. The response doesn't mention the IP rule. IP filter rules are applied in order, and the first rule that matches the IP address determines the accept or reject action.
## Use Azure portal

This section shows you how to use the Azure portal to create IP firewall rules for an Event Hubs namespace.
event-hubs Event Hubs Service Endpoints https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/event-hubs/event-hubs-service-endpoints.md
Title: Virtual Network service endpoints - Azure Event Hubs | Microsoft Docs description: This article provides information on how to add a Microsoft.EventHub service endpoint to a virtual network. Previously updated : 02/12/2021 Last updated : 03/29/2021 # Allow access to Azure Event Hubs namespaces from specific virtual networks
-The integration of Event Hubs with [Virtual Network (VNet) Service Endpoints][vnet-sep] enables secure access to messaging capabilities from workloads such as virtual machines that are bound to virtual networks, with the network traffic path being secured on both ends. Virtual networks are supported in **standard** and **dedicated** tiers of Event Hubs. It's not supported in the **basic** tier.
+The integration of Event Hubs with [Virtual Network (VNet) Service Endpoints][vnet-sep] enables secure access to messaging capabilities from workloads such as virtual machines that are bound to virtual networks, with the network traffic path being secured on both ends.
Once bound to at least one virtual network subnet service endpoint, the respective Event Hubs namespace no longer accepts traffic from anywhere but authorized subnets in virtual networks. From the virtual network perspective, binding an Event Hubs namespace to a service endpoint configures an isolated networking tunnel from the virtual network subnet to the messaging service. The result is a private and isolated relationship between the workloads bound to the subnet and the respective Event Hubs namespace, in spite of the observable network address of the messaging service endpoint being in a public IP range. There's an exception to this behavior. Enabling a service endpoint, by default, enables the `denyall` rule in the [IP firewall](event-hubs-ip-filtering.md) associated with the virtual network. You can add specific IP addresses in the IP firewall to enable access to the Event Hubs public endpoint.
->[!WARNING]
-> Enabling virtual networks for your Event Hubs namespace blocks incoming requests by default, unless requests originate from a service operating from allowed virtual networks. Requests that are blocked include those from other Azure services, from the Azure portal, from logging and metrics services, and so on. As an exception, you can allow access to Event Hubs resources from certain trusted services even when virtual networks are enabled. For a list of trusted services, see [Trusted services](#trusted-microsoft-services).
-
-> [!IMPORTANT]
-> Specify at least one IP rule or virtual network rule for the namespace to allow traffic only from the specified IP addresses or subnet of a virtual network. If there are no IP and virtual network rules, the namespace can be accessed over the public internet (using the access key).
+## Important points
+- This feature is supported for both **standard** and **dedicated** tiers. It's not supported in the **basic** tier.
+- Enabling virtual networks for your Event Hubs namespace blocks incoming requests by default, unless requests originate from a service operating from allowed virtual networks. Requests that are blocked include those from other Azure services, from the Azure portal, from logging and metrics services, and so on. As an exception, you can allow access to Event Hubs resources from certain **trusted services** even when virtual networks are enabled. For a list of trusted services, see [Trusted services](#trusted-microsoft-services).
+- Specify **at least one IP rule or virtual network rule** for the namespace to allow traffic only from the specified IP addresses or subnet of a virtual network. If there are no IP and virtual network rules, the namespace can be accessed over the public internet (using the access key).
## Advanced security scenarios enabled by VNet integration
event-hubs Private Link Service https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/event-hubs/private-link-service.md
A private endpoint is a network interface that connects you privately and securely to a service powered by Azure Private Link.
For more information, see [What is Azure Private Link?](../private-link/private-link-overview.md)
-> [!WARNING]
-> Enabling private endpoints can prevent other Azure services from interacting with Event Hubs. Requests that are blocked include those from other Azure services, from the Azure portal, from logging and metrics services, and so on. As an exception, you can allow access to Event Hubs resources from certain trusted services even when private endpoints are enabled. For a list of trusted services, see [Trusted services](#trusted-microsoft-services).
-
->[!NOTE]
-> This feature is supported for both **standard** and **dedicated** tiers. It's not supported in the **basic** tier.
+## Important points
+- This feature is supported for both **standard** and **dedicated** tiers. It's not supported in the **basic** tier.
+- Enabling private endpoints can prevent other Azure services from interacting with Event Hubs. Requests that are blocked include those from other Azure services, from the Azure portal, from logging and metrics services, and so on. As an exception, you can allow access to Event Hubs resources from certain **trusted services** even when private endpoints are enabled. For a list of trusted services, see [Trusted services](#trusted-microsoft-services).
+- Specify **at least one IP rule or virtual network rule** for the namespace to allow traffic only from the specified IP addresses or subnet of a virtual network. If there are no IP and virtual network rules, the namespace can be accessed over the public internet (using the access key).
## Add a private endpoint using Azure portal
If you already have an Event Hubs namespace, you can create a private link connection by following these steps:
:::image type="content" source="./media/private-link-service/selected-networks-page.png" alt-text="Networks tab - selected networks option" lightbox="./media/private-link-service/selected-networks-page.png":::
- > [!NOTE]
- > By default, the **Selected networks** option is selected. If you don't specify an IP firewall rule or add a virtual network, the namespace can be accessed via public internet.
+ > [!WARNING]
+ > By default, the **Selected networks** option is selected. If you don't specify an IP firewall rule or add a virtual network, the namespace can be accessed via public internet (using the access key).
1. Select the **Private endpoint connections** tab at the top of the page.
1. Select the **+ Private Endpoint** button at the top of the page.
expressroute Expressroute Locations https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/expressroute/expressroute-locations.md
The following table shows locations by service provider. If you want to view ava
| **[Telehouse - KDDI](https://www.telehouse.net/solutions/cloud-services/cloud-link)** |Supported |Supported |London, London2, Singapore2 |
| **Telenor** |Supported |Supported |Amsterdam, London, Oslo |
| **[Telia Carrier](https://www.teliacarrier.com/)** | Supported | Supported |Amsterdam, Chicago, Dallas, Frankfurt, Hong Kong, London, Oslo, Paris, Silicon Valley, Stockholm, Washington DC |
-| **[Telin](https://www.telin.net/)** | Supported | Supported |Jakarta |
+| **[Telin](https://www.telin.net/product/data-connectivity/telin-cloud-exchange)** | Supported | Supported |Jakarta |
| **Telmex Uninet**| Supported | Supported | Dallas |
| **[Telstra Corporation](https://www.telstra.com.au/business-enterprise/network-services/networks/cloud-direct-connect/)** |Supported |Supported |Melbourne, Singapore, Sydney |
| **[Telus](https://www.telus.com)** |Supported |Supported |Montreal, Seattle, Quebec City, Toronto, Vancouver |
firewall-manager Deploy Trusted Security Partner https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/firewall-manager/deploy-trusted-security-partner.md
Remember that a VPN gateway must be deployed to convert an existing hub to a secured hub.
## Configure third-party security providers to connect to a secured hub
-To set up tunnels to your virtual hubΓÇÖs VPN Gateway, third-party providers need access rights to your hub. To do this, associate a service principal with your subscription or resource group, and grant access rights. You then must give these credentials to the third-party using their portal.
+To set up tunnels to your virtual hub's VPN Gateway, third-party providers need access rights to your hub. To do this, associate a service principal with your subscription or resource group, and grant access rights. You then must give these credentials to the third party using their portal.
### Create and authorize a service principal
To set up tunnels to your virtual hub's VPN Gateway, third-party providers need access rights to your hub.
2. You can look at the tunnel creation status on the Azure Virtual WAN portal in Azure. Once the tunnels show **connected** on both Azure and the partner portal, continue with the next steps to set up routes to select which branches and VNets should send Internet traffic to the partner.
-## Configure route settings
+## Configure security with Firewall Manager
1. Browse to the Azure Firewall Manager -> Secured Hubs.
-2. Select a hub. The Hub status should now show **Provisioned** instead of **Security Connection Pending**.
+2. Select a hub. The hub status should now show **Provisioned** instead of **Security Connection Pending**.
Ensure the third-party provider can connect to the hub. The tunnels on the VPN gateway should be in a **Connected** state. This state is more reflective of the connection health between the hub and the third-party partner, compared to the previous status.
-3. Select the hub, and navigate to **Route Settings**.
+3. Select the hub, and navigate to **Security Configurations**.
When you deploy a third-party provider into the hub, it converts the hub into a *secured virtual hub*. This ensures that the third-party provider is advertising a 0.0.0.0/0 (default) route to the hub. However, VNet connections and sites connected to the hub don't get this route unless you opt in on which connections should get this default route.
-4. Under **Internet traffic**, select **VNet-to-Internet** or **Branch-to-Internet** or both so routes are configured send via the third party.
+4. Configure virtual WAN security by setting **Internet Traffic** via Azure Firewall and **Private Traffic** via a trusted security partner. This automatically secures individual connections in the Virtual WAN.
- This only indicates which type of traffic should be routed to the hub, but it doesn't affect the routes on VNets or branches yet. These routes are not propagated to all VNets/branches attached to the hub by default.
-5. You must select **secure connections** and select the connections on which these routes should be set. This indicates which VNets/branches can start sending Internet traffic to the third-party provider.
-6. From **Route settings**, select **Secure connections** under Internet traffic, then select the VNet or branches (*sites* in Virtual WAN) to be secured. Select **Secure Internet traffic**.
- ![Secure Internet traffic](media/deploy-trusted-security-partner/secure-internet-traffic.png)
-7. Navigate back to the hubs page. The hub's **security partner provider** status should now be **Secured**.
+ :::image type="content" source="media/deploy-trusted-security-partner/security-configuration.png" alt-text="Security configuration":::
+5. Additionally, if your organization uses public IP ranges in virtual networks and branch offices, you need to specify those IP prefixes explicitly using **Private Traffic Prefixes**. The public IP address prefixes can be specified individually or as aggregates.
## Branch or VNet Internet traffic via third-party service Next, you can check if VNet virtual machines or the branch site can access the Internet and validate that the traffic is flowing to the third-party service.
-After finishing the route setting steps, the VNet virtual machines as well as the branch sites are sent a 0/0 to third party service route. You can't RDP or SSH into these virtual machines. To sign in, you can deploy the [Azure Bastion](../bastion/bastion-overview.md) service in a peered VNet.
+After finishing the route setting steps, the VNet virtual machines as well as the branch sites are sent a 0/0 route to the third-party service. You can't RDP or SSH into these virtual machines. To sign in, you can deploy the [Azure Bastion](../bastion/bastion-overview.md) service in a peered VNet.
## Next steps
firewall-manager Trusted Security Partners https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/firewall-manager/trusted-security-partners.md
Previously updated : 03/29/2021 Last updated : 03/30/2021
The following scenarios are supported:
VNet/Branch-to-Internet via a security partner provider and the other traffic (spoke-to-spoke, spoke-to-branch, branch-to-spoke) via Azure Firewall.
- Single provider in the hub
- - All traffic (spoke-to-spoke, spoke-to-branch, branch-to-spoke, VNet/Branch-to-Internet) secured by Azure Firewall
+ - All traffic (spoke-to-spoke, spoke-to-branch, branch-to-spoke, VNet/Branch-to-Internet) secured by Azure Firewall<br>
+ or
- VNet/Branch-to-Internet via a security partner provider

## Best practices for Internet traffic filtering in secured virtual hubs
firewall Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/firewall/overview.md
Last updated 03/10/2021
-Customer intent: As an administrator, I want to evaluate Azure Firewall so I can determine if I want to use it.
+# Customer intent: As an administrator, I want to evaluate Azure Firewall so I can determine if I want to use it.
# What is Azure Firewall?
firewall Premium Features https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/firewall/premium-features.md
Previously updated : 03/12/2021 Last updated : 03/30/2021
You're welcome to submit a request at [https://aka.ms/azfw-webcategories-request
Azure Firewall Premium Preview is supported in the following regions:
-- West Europe (Public / Europe)
-- East US (Public / United States)
-- Australia East (Public / Australia)
-- Southeast Asia (Public / Asia Pacific)
-- UK South (Public / United Kingdom)
-- North Europe (Public / Europe)
-- East US 2 (Public / United States)
-- South Central US (Public / United States)
-- West US 2 (Public / United States)
-- West US (Public / United States)
-- Central US (Public / United States)
-- North Central US (Public / United States)
-- Japan East (Public / Japan)
-- East Asia (Public / Asia Pacific)
-- Canada Central (Public / Canada)
-- France Central (Public / France)
-- South Africa North (Public / South Africa)
-- UAE North (Public / UAE)
-- Switzerland North (Public / Switzerland)
-- Brazil South (Public / Brazil)
-- Norway East (Public / Norway)
- Australia Central (Public / Australia)
- Australia Central 2 (Public / Australia)
+- Australia East (Public / Australia)
- Australia Southeast (Public / Australia)
+- Brazil South (Public / Brazil)
+- Canada Central (Public / Canada)
- Canada East (Public / Canada)
+- Central US (Public / United States)
- Central US EUAP (Public / Canary (US))
+- East Asia (Public / Asia Pacific)
+- East US (Public / United States)
+- East US 2 (Public / United States)
+- France Central (Public / France)
- France South (Public / France)
+- Japan East (Public / Japan)
- Japan West (Public / Japan)
+- Korea Central (Public / Korea)
- Korea South (Public / Korea)
+- North Central US (Public / United States)
+- North Europe (Public / Europe)
+- South Africa North (Public / South Africa)
+- South Central US (Public / United States)
+- Southeast Asia (Public / Asia Pacific)
- UAE Central (Public / UAE)
+- UK South (Public / United Kingdom)
- UK West (Public / United Kingdom)
- West Central US (Public / United States)
+- West Europe (Public / Europe)
- West India (Public / India)
+- West US (Public / United States)
+- West US 2 (Public / United States)
## Known issues
frontdoor Front Door Custom Domain Https https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/front-door-custom-domain-https.md
ms.devlang: na
Last updated 03/26/2021
-# As a website owner, I want to enable HTTPS on the custom domain in my Front Door so that my users can use my custom domain to access their content securely.
-
+# Customer intent: As a website owner, I want to enable HTTPS on the custom domain in my Front Door so that my users can use my custom domain to access their content securely.
# Tutorial: Configure HTTPS on a Front Door custom domain
To enable HTTPS on a custom domain, follow these steps:
> [!NOTE]
> For AFD managed certificates, DigiCert's 64-character limit is enforced. Validation will fail if that limit is exceeded.
+> [!NOTE]
+> Enabling HTTPS via a Front Door managed certificate isn't supported for apex/root domains (for example, contoso.com). You can use your own certificate for this scenario. Continue with Option 2 for details.
### Option 2: Use your own certificate
frontdoor Front Door Custom Domain https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/front-door-custom-domain.md
ms.devlang: na
Last updated 09/24/2020
-# As a website owner, I want to add a custom domain to my Front Door configuration so that my users can use my custom domain to access my content.
-
+# Customer intent: As a website owner, I want to add a custom domain to my Front Door configuration so that my users can use my custom domain to access my content.
+# Tutorial: Add a custom domain to your Front Door
+
+This tutorial shows how to add a custom domain to your Front Door. When you use Azure Front Door for application delivery, a custom domain is necessary if you would like your own domain name to be visible in your end-user request. Having a visible domain name can be convenient for your customers and useful for branding purposes.
+
+After you create a Front Door, the default frontend host, which is a subdomain of `azurefd.net`, is included in the URL for delivering Front Door content from your backend by default (for example, https:\//contoso-frontend.azurefd.net/activeusers.htm). For your convenience, Azure Front Door provides the option of associating a custom domain with the default host. With this option, you deliver your content with a custom domain in your URL instead of a Front Door owned domain name (for example, https:\//www.contoso.com/photo.png).
frontdoor Front Door Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/front-door-overview.md
Last updated 03/09/2021
-# customer intent: As an IT admin, I want to learn about Front Door and what I can use it for.
+# Customer intent: As an IT admin, I want to learn about Front Door and what I can use it for.
# What is Azure Front Door?
frontdoor Front Door Rules Engine Actions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/front-door-rules-engine-actions.md
na
Last updated 09/29/2020
-# customer intent: As an IT admin, I want to learn about Front Door and what new features are available.
+# Customer intent: As an IT admin, I want to learn about Front Door and what new features are available.
# Azure Front Door Rules Engine Actions
frontdoor Front Door Rules Engine Match Conditions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/front-door-rules-engine-match-conditions.md
na
Last updated 03/01/2020
-# customer intent: As an IT admin, I want to learn about Front Door and what new features are available.
+# Customer intent: As an IT admin, I want to learn about Front Door and what new features are available.
# Azure Front Door Rules Engine match conditions
frontdoor Front Door Rules Engine https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/front-door-rules-engine.md
na
Last updated 9/29/2020
-# customer intent: As an IT admin, I want to learn about Front Door and what the Rules Engine feature does.
+# Customer intent: As an IT admin, I want to learn about Front Door and what the Rules Engine feature does.
# What is Rules Engine for Azure Front Door?
frontdoor Front Door Security Headers https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/front-door-security-headers.md
na
Last updated 09/14/2020
-# customer intent: As an IT admin, I want to learn about Front Door and how to configure a security header via Rules Engine.
+# Customer intent: As an IT admin, I want to learn about Front Door and how to configure a security header via Rules Engine.
# Tutorial: Add Security headers with Rules Engine
frontdoor Front Door Tutorial Rules Engine https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/front-door-tutorial-rules-engine.md
Last updated 09/09/2020
-# customer intent: As an IT admin, I want to learn about Front Door and how to configure Rules Engine feature via the Azure portal or Azure CLI.
+# Customer intent: As an IT admin, I want to learn about Front Door and how to configure Rules Engine feature via the Azure portal or Azure CLI.
# Tutorial: Configure your Rules Engine
frontdoor Front Door Whats New https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/front-door-whats-new.md
na
Last updated 4/30/2020
-# customer intent: As an IT admin, I want to learn about Front Door and what new features are available.
+# Customer intent: As an IT admin, I want to learn about Front Door and what new features are available.
# What's new in Azure Front Door?
frontdoor Quickstart Create Front Door Cli https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/quickstart-create-front-door-cli.md
description: This quickstart will show you how to use Azure Front Door to create
-Customer intent: As an IT admin, I want to direct user traffic to ensure high availability of web applications.
+# Customer intent: As an IT admin, I want to direct user traffic to ensure high availability of web applications.
ms.assetid: ms.devlang: na
frontdoor Quickstart Create Front Door Powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/quickstart-create-front-door-powershell.md
documentationcenter: na
-Customer intent: As an IT admin, I want to direct user traffic to ensure high availability of web applications.
+# Customer intent: As an IT admin, I want to direct user traffic to ensure high availability of web applications.
ms.assetid: ms.devlang: na
frontdoor Quickstart Create Front Door https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/quickstart-create-front-door.md
documentationcenter: na
-Customer intent: As an IT admin, I want to direct user traffic to ensure high availability of web applications.
+# Customer intent: As an IT admin, I want to direct user traffic to ensure high availability of web applications.
ms.devlang: na
frontdoor Concept Rule Set Actions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/standard-premium/concept-rule-set-actions.md
Previously updated : 02/18/2021 Last updated : 03/31/2021
-# Azure Front Door Standard/Premium Rule Set Actions
+# Azure Front Door Standard/Premium (Preview) Rule Set actions
> [!Note]
> This documentation is for Azure Front Door Standard/Premium (Preview). Looking for information on Azure Front Door? View [here](../front-door-overview.md).
-An Azure Front Door [Rule Set](concept-rule-set.md) consist of rules with a combination of match conditions and actions. This article provides a detailed description of the actions you can use in a Rule Set. The action defines the behavior that gets applied to a request type that a match condition(s) identifies. In an Azure Front Door Rule Set, a rule can contain up to five actions. Server variable is supported on all actions.
+An Azure Front Door Standard/Premium [Rule Set](concept-rule-set.md) consists of rules with a combination of match conditions and actions. This article provides a detailed description of the actions you can use in an Azure Front Door Standard/Premium Rule Set. An action defines the behavior that gets applied to the requests that a rule's match conditions identify. In an Azure Front Door (Standard/Premium) Rule Set, a rule can contain up to five actions.
> [!IMPORTANT]
> Azure Front Door Standard/Premium (Preview) is currently in public preview.
An Azure Front Door [Rule Set](concept-rule-set.md) consist of rules with a comb
The following actions are available to use in Azure Front Door rule set.
-## Cache expiration
+## <a name="CacheExpiration"></a> Cache expiration
-Use this action to overwrite the time to live (TTL) value of the endpoint for requests that the rules match conditions specify.
+Use the **cache expiration** action to overwrite the time to live (TTL) value of the endpoint for requests that the rules match conditions specify.
-### Required fields
+> [!NOTE]
+> Origins may specify not to cache specific responses using the `Cache-Control` header with a value of `no-cache`, `private`, or `no-store`. In these circumstances, Front Door will never cache the content and this action will have no effect.
-The following description applies when selecting these cache behaviors and the rule matches:
+### Properties
-Cache behavior | Description
-|-
-Bypass cache | The content isn't cached.
-Override | The TTL value returned from your origin is overwritten with the value specified in the action. This behavior will only be applied if the response is cacheable. For cache-control response header with values "no-cache", "private", "no-store", the action won't be applicable.
-Set if missing | If no TTL value gets returned from your origin, the rule sets the TTL to the value specified in the action. This behavior will only be applied if the response is cacheable. For cache-control response header with values "no-cache", "private", "no-store", the action won't be applicable.
+| Property | Supported values |
+|-||
+| Cache behavior | <ul><li>**Bypass cache:** The content should not be cached. In ARM templates, set the `cacheBehavior` property to `BypassCache`.</li><li>**Override:** The TTL value returned from your origin is overwritten with the value specified in the action. This behavior will only be applied if the response is cacheable. In ARM templates, set the `cacheBehavior` property to `Override`.</li><li>**Set if missing:** If no TTL value gets returned from your origin, the rule sets the TTL to the value specified in the action. This behavior will only be applied if the response is cacheable. In ARM templates, set the `cacheBehavior` property to `SetIfMissing`.</li></ul> |
+| Cache duration | When _Cache behavior_ is set to `Override` or `Set if missing`, these fields must specify the cache duration to use. The maximum duration is 366 days.<ul><li>In the Azure portal: specify the days, hours, minutes, and seconds.</li><li>In ARM templates: specify the duration in the format `d.hh:mm:ss`.</li></ul> |
-### Additional fields
+### Example
-Days | Hours | Minutes | Seconds
|-||--
-Int | Int | Int | Int
+In this example, we override the cache expiration to 6 hours, for matched requests that don't specify a cache duration already.
-## Cache key query string
+# [Portal](#tab/portal)
-Use this action to modify the cache key based on query strings.
-### Required fields
+# [JSON](#tab/json)
-The following description applies when selecting these behaviors and the rule matches:
+```json
+{
+ "name": "CacheExpiration",
+ "parameters": {
+ "cacheBehavior": "SetIfMissing",
+ "cacheType": "All",
+ "cacheDuration": "0.06:00:00",
+ "@odata.type": "#Microsoft.Azure.Cdn.Models.DeliveryRuleCacheExpirationActionParameters"
+ }
+}
+```
-Behavior | Description
-|
-Include | Query strings specified in the parameters get included when the cache key gets generated.
-Cache every unique URL | Each unique URL has its own cache key.
-Exclude | Query strings specified in the parameters get excluded when the cache key gets generated.
-Ignore query strings | Query strings aren't considered when the cache key gets generated.
+# [Bicep](#tab/bicep)
-## Modify request header
+```bicep
+{
+ name: 'CacheExpiration'
+ parameters: {
+ cacheBehavior: 'SetIfMissing'
+    cacheType: 'All'
+ cacheDuration: '0.06:00:00'
+ '@odata.type': '#Microsoft.Azure.Cdn.Models.DeliveryRuleCacheExpirationActionParameters'
+ }
+}
+```
-Use this action to modify headers that are present in requests sent to your origin.
++
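The `d.hh:mm:ss` duration format used in ARM templates can be illustrated with a short sketch. This helper is hypothetical (not part of any Azure SDK) and assumes the format described in the properties table above:

```python
# Illustrative only: parse a cache duration string in the ARM-template
# `d.hh:mm:ss` format (days, then hours:minutes:seconds) into a timedelta.
from datetime import timedelta

def parse_cache_duration(value: str) -> timedelta:
    # The days component and its trailing "." may be omitted.
    days_part, time_part = value.split(".", 1) if "." in value else ("0", value)
    hours, minutes, seconds = (int(p) for p in time_part.split(":"))
    return timedelta(days=int(days_part), hours=hours,
                     minutes=minutes, seconds=seconds)
```

For example, the `cacheDuration` of `0.06:00:00` in the JSON example above corresponds to six hours.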
+## <a name="CacheKeyQueryString"></a> Cache key query string
+
+Use the **cache key query string** action to modify the cache key based on query strings. The cache key is the way that Front Door identifies unique requests to cache.
+
+### Properties
-### Required fields
+| Property | Supported values |
+|-||
+| Behavior | <ul><li>**Include:** Query strings specified in the parameters get included when the cache key gets generated. In ARM templates, set the `queryStringBehavior` property to `Include`.</li><li>**Cache every unique URL:** Each unique URL has its own cache key. In ARM templates, use the `queryStringBehavior` of `IncludeAll`.</li><li>**Exclude:** Query strings specified in the parameters get excluded when the cache key gets generated. In ARM templates, set the `queryStringBehavior` property to `Exclude`.</li><li>**Ignore query strings:** Query strings aren't considered when the cache key gets generated. In ARM templates, set the `queryStringBehavior` property to `ExcludeAll`.</li></ul> |
+| Parameters | The list of query string parameter names, separated by commas. |
-The following description applies when selecting these actions and the rule matches:
+### Example
-Action | HTTP header name | Value
--||
-Append | The header specified in **Header name** gets added to the request with the specified value. If the header is already present, the value is appended to the existing value. | String
-Overwrite | The header specified in **Header name** gets added to the request with the specified value. If the header is already present, the specified value overwrites the existing value. | String
-Delete | If the header specified in the rule is present, the header gets deleted from the request. | String
+In this example, we modify the cache key to include a query string parameter named `customerId`.
-## Modify response header
+# [Portal](#tab/portal)
-Use this action to modify headers that are present in responses returned to your clients.
-### Required fields
+# [JSON](#tab/json)
-The following description applies when selecting these actions and the rule matches:
+```json
+{
+ "name": "CacheKeyQueryString",
+ "parameters": {
+ "queryStringBehavior": "Include",
+ "queryParameters": "customerId",
+ "@odata.type": "#Microsoft.Azure.Cdn.Models.DeliveryRuleCacheKeyQueryStringBehaviorActionParameters"
+ }
+}
+```
-Action | HTTP Header name | Value
--||
-Append | The header specified in **Header name** gets added to the response by using the specified **Value**. If the header is already present, **Value** is appended to the existing value. | String
-Overwrite | The header specified in **Header name** gets added to the response by using the specified **Value**. If the header is already present, **Value** overwrites the existing value. | String
-Delete | If the header specified in the rule is present, the header gets deleted from the response. | String
+# [Bicep](#tab/bicep)
+
+```bicep
+{
+ name: 'CacheKeyQueryString'
+ parameters: {
+ queryStringBehavior: 'Include'
+ queryParameters: 'customerId'
+ '@odata.type': '#Microsoft.Azure.Cdn.Models.DeliveryRuleCacheKeyQueryStringBehaviorActionParameters'
+ }
+}
+```
++
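The cache key behaviors above can be emulated with a minimal sketch. This is not Front Door's actual implementation; it only models how the `Include`, `Exclude`, `IncludeAll`, and `ExcludeAll` behaviors decide which query string parameters contribute to the key:

```python
# Minimal emulation of the cache key query string behaviors.
from urllib.parse import urlsplit, parse_qsl, urlencode

def cache_key(url: str, behavior: str, params: set) -> str:
    parts = urlsplit(url)
    pairs = parse_qsl(parts.query)
    if behavior == "Include":
        # Only the named parameters contribute to the cache key.
        pairs = [(k, v) for k, v in pairs if k in params]
    elif behavior == "Exclude":
        # The named parameters are dropped from the cache key.
        pairs = [(k, v) for k, v in pairs if k not in params]
    elif behavior == "ExcludeAll":
        # Query strings are ignored entirely.
        pairs = []
    # "IncludeAll" keeps every pair: each unique URL gets its own key.
    return parts.path + ("?" + urlencode(pairs) if pairs else "")
```

Under this model, the `Include` example above with parameter `customerId` would key `/a?customerId=1&ts=9` as `/a?customerId=1`.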
-## URL redirect
+## <a name="ModifyRequestHeader"></a> Modify request header
-Use this action to redirect clients to a new URL.
+Use the **modify request header** action to modify the headers in the request when it is sent to your origin.
-### Required fields
+### Properties
-Field | Description
-|
-Redirect Type | Select the response type to return to the requestor: Found (302), Moved (301), Temporary redirect (307), and Permanent redirect (308).
-Redirect protocol | Match Request, HTTP, HTTPS.
-Destination host | Select the host name you want the request to be redirected to. Leave blank to preserve the incoming host.
-Destination path | Define the path to use in the redirect. Leave blank to preserve the incoming path.
-Query string | Define the query string used in the redirect. Leave blank to preserve the incoming query string.
-Destination fragment | Define the fragment to use in the redirect. Leave blank to preserve the incoming fragment.
+| Property | Supported values |
+|-||
+| Operator | <ul><li>**Append:** The specified header gets added to the request with the specified value. If the header is already present, the value is appended to the existing header value using string concatenation. No delimiters are added. In ARM templates, use the `headerAction` of `Append`.</li><li>**Overwrite:** The specified header gets added to the request with the specified value. If the header is already present, the specified value overwrites the existing value. In ARM templates, use the `headerAction` of `Overwrite`.</li><li>**Delete:** If the header specified in the rule is present, the header gets deleted from the request. In ARM templates, use the `headerAction` of `Delete`.</li></ul> |
+| Header name | The name of the header to modify. |
+| Header value | The value to append or overwrite. |
-## URL rewrite
+### Example
-Use this action to rewrite the path of a request that's en route to your origin.
+In this example, we append the value `AdditionalValue` to the `MyRequestHeader` request header. If the client sent the request with the header set to `ValueSetByClient`, then after this action is applied, the request header would have a value of `ValueSetByClientAdditionalValue`.
-### Required fields
+# [Portal](#tab/portal)
-Field | Description
-|
-Source pattern | Define the source pattern in the URL path to replace. Currently, source pattern uses a prefix-based match. To match all URL paths, use a forward slash (**/**) as the source pattern value.
-Destination | Define the destination path to use in the rewrite. The destination path overwrites the source pattern.
-Preserve unmatched path | If set to **Yes**, the remaining path after the source pattern is appended to the new destination path.
+
+# [JSON](#tab/json)
+
+```json
+{
+ "name": "ModifyRequestHeader",
+ "parameters": {
+ "headerAction": "Append",
+ "headerName": "MyRequestHeader",
+ "value": "AdditionalValue",
+ "@odata.type": "#Microsoft.Azure.Cdn.Models.DeliveryRuleHeaderActionParameters"
+ }
+}
+```
+
+# [Bicep](#tab/bicep)
+
+```bicep
+{
+ name: 'ModifyRequestHeader'
+ parameters: {
+ headerAction: 'Append'
+ headerName: 'MyRequestHeader'
+ value: 'AdditionalValue'
+ '@odata.type': '#Microsoft.Azure.Cdn.Models.DeliveryRuleHeaderActionParameters'
+ }
+}
+```
++
-## Server Variable
+## <a name="ModifyResponseHeader"></a> Modify response header
-### Supported Variables
+Use the **modify response header** action to modify headers that are present in responses before they are returned to your clients.
-| Variable name | Description |
-| -- | :-- |
-| socket_ip | The IP address of the direct connection to Azure Front Door edge. If the client used an HTTP proxy or a load balancer to send the request, the value of SocketIp is the IP address of the proxy or load balancer. |
-| client_ip | The IP address of the client that made the original request. If there was an X-Forwarded-For header in the request, then the Client IP is picked from the same. |
-| client_port | The IP port of the client that made the request. |
-| hostname | The host name in the request from client. |
-| geo_country | Indicates the requester's country/region of origin through its country/region code. |
-| http_method | The method used to make the URL request. For example, GET or POST. |
-| http_version | The request protocol. Usually HTTP/1.0, HTTP/1.1, or HTTP/2.0. |
-| query_string | The list of variable/value pairs that follows the "?" in the requested URL. Example: in the request *http://contoso.com:8080/article.aspx?id=123&title=fabrikam*, query_string value will be *id=123&title=fabrikam* |
-| request_scheme | The request scheme: http or https. |
-| request_uri | The full original request URI (with arguments). Example: in the request *http://contoso.com:8080/article.aspx?id=123&title=fabrikam*, request_uri value will be */article.aspx?id=123&title=fabrikam* |
-| server_port | The port of the server that accepted a request. |
-| ssl_protocol | The protocol of an established TLS connection. |
-| url_path | Identifies the specific resource in the host that the web client wants to access. This is the part of the request URI without the arguments. Example: in the request *http://contoso.com:8080/article.aspx?id=123&title=fabrikam*, uri_path value will be */article.aspx* |
+### Properties
-### Server Variable Format
+| Property | Supported values |
+|-||
+| Operator | <ul><li>**Append:** The specified header gets added to the response with the specified value. If the header is already present, the value is appended to the existing header value using string concatenation. No delimiters are added. In ARM templates, use the `headerAction` of `Append`.</li><li>**Overwrite:** The specified header gets added to the response with the specified value. If the header is already present, the specified value overwrites the existing value. In ARM templates, use the `headerAction` of `Overwrite`.</li><li>**Delete:** If the header specified in the rule is present, the header gets deleted from the response. In ARM templates, use the `headerAction` of `Delete`.</li></ul> |
+| Header name | The name of the header to modify. |
+| Header value | The value to append or overwrite. |
-**Format:** {variable:offset}, {variable:offset:length}, {variable}
+### Example
+
+In this example, we delete the header with the name `X-Powered-By` from the responses before they are returned to the client.
+
+# [Portal](#tab/portal)
++
+# [JSON](#tab/json)
+
+```json
+{
+ "name": "ModifyResponseHeader",
+ "parameters": {
+ "headerAction": "Delete",
+ "headerName": "X-Powered-By",
+ "@odata.type": "#Microsoft.Azure.Cdn.Models.DeliveryRuleHeaderActionParameters"
+ }
+}
+```
+
+# [Bicep](#tab/bicep)
+
+```bicep
+{
+ name: 'ModifyResponseHeader'
+ parameters: {
+ headerAction: 'Delete'
+ headerName: 'X-Powered-By'
+ '@odata.type': '#Microsoft.Azure.Cdn.Models.DeliveryRuleHeaderActionParameters'
+ }
+}
+```
+++
+## <a name="UrlRedirect"></a> URL redirect
+
+Use the **URL redirect** action to redirect clients to a new URL. Clients are sent a redirection response from Front Door.
+
+### Properties
+
+| Property | Supported values |
+|-||
+| Redirect type | The response type to return to the requestor. <ul><li>In the Azure portal: Found (302), Moved (301), Temporary Redirect (307), Permanent Redirect (308).</li><li>In ARM templates: `Found`, `Moved`, `TemporaryRedirect`, `PermanentRedirect`</li></ul> |
+| Redirect protocol | <ul><li>In the Azure portal: `Match Request`, `HTTP`, `HTTPS`</li><li>In ARM templates: `MatchRequest`, `Http`, `Https`</li></ul> |
+| Destination host | The host name you want the request to be redirected to. Leave blank to preserve the incoming host. |
+| Destination path | The path to use in the redirect. Include the leading `/`. Leave blank to preserve the incoming path. |
+| Query string | The query string used in the redirect. Don't include the leading `?`. Leave blank to preserve the incoming query string. |
+| Destination fragment | The fragment to use in the redirect. Leave blank to preserve the incoming fragment. |
+
+### Example
+
+In this example, we redirect the request to `https://contoso.com/exampleredirection?clientIp={client_ip}`, while preserving the fragment. An HTTP Temporary Redirect (307) is used. The IP address of the client is used in place of the `{client_ip}` token within the URL by using the `client_ip` [server variable](#server-variables).
+
+# [Portal](#tab/portal)
++
+# [JSON](#tab/json)
+
+```json
+{
+ "name": "UrlRedirect",
+ "parameters": {
+ "redirectType": "TemporaryRedirect",
+ "destinationProtocol": "Https",
+ "customHostname": "contoso.com",
+ "customPath": "/exampleredirection",
+ "customQueryString": "clientIp={client_ip}",
+ "@odata.type": "#Microsoft.Azure.Cdn.Models.DeliveryRuleUrlRedirectActionParameters"
+ }
+}
+```
+
+# [Bicep](#tab/bicep)
+
+```bicep
+{
+ name: 'UrlRedirect'
+ parameters: {
+ redirectType: 'TemporaryRedirect'
+ destinationProtocol: 'Https'
+ customHostname: 'contoso.com'
+ customPath: '/exampleredirection'
+ customQueryString: 'clientIp={client_ip}'
+ '@odata.type': '#Microsoft.Azure.Cdn.Models.DeliveryRuleUrlRedirectActionParameters'
+ }
+}
+```
++
-### Supported server variable actions
+## <a name="UrlRewrite"></a> URL rewrite
+
+Use the **URL rewrite** action to rewrite the path of a request that's en route to your origin.
+
+### Properties
+
+| Property | Supported values |
+|-||
+| Source pattern | Define the source pattern in the URL path to replace. Currently, source pattern uses a prefix-based match. To match all URL paths, use a forward slash (`/`) as the source pattern value. |
+| Destination | Define the destination path to use in the rewrite. The destination path overwrites the source pattern. |
+| Preserve unmatched path | If set to _Yes_, the remaining path after the source pattern is appended to the new destination path. |
+
+### Example
+
+In this example, we rewrite all requests to the path `/redirection`, and don't preserve the remainder of the path.
+
+# [Portal](#tab/portal)
++
+# [JSON](#tab/json)
+
+```json
+{
+ "name": "UrlRewrite",
+ "parameters": {
+ "sourcePattern": "/",
+ "destination": "/redirection",
+ "preserveUnmatchedPath": false,
+ "@odata.type": "#Microsoft.Azure.Cdn.Models.DeliveryRuleUrlRewriteActionParameters"
+ }
+}
+```
+
+# [Bicep](#tab/bicep)
+
+```bicep
+{
+ name: 'UrlRewrite'
+ parameters: {
+ sourcePattern: '/'
+ destination: '/redirection'
+ preserveUnmatchedPath: false
+ '@odata.type': '#Microsoft.Azure.Cdn.Models.DeliveryRuleUrlRewriteActionParameters'
+ }
+}
+```
+++
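The prefix-based match and the _Preserve unmatched path_ option can be sketched as follows. This is a hypothetical emulation, not Front Door edge code:

```python
# Emulates the URL rewrite action: prefix match on the source pattern,
# optionally appending the remainder of the path to the destination.
def rewrite_path(path: str, source_pattern: str, destination: str,
                 preserve_unmatched_path: bool) -> str:
    if not path.startswith(source_pattern):
        return path  # rule doesn't match; path is left unchanged
    if preserve_unmatched_path:
        # Remainder after the source pattern is appended to the destination.
        return destination + path[len(source_pattern):]
    return destination
```

With the JSON example above (source pattern `/`, destination `/redirection`, unmatched path not preserved), every request path rewrites to `/redirection`.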
+## Server variables
+
+Rule Set server variables provide access to structured information about the request. You can use server variables to dynamically change the request/response headers or URL rewrite paths/query strings, for example, when a new page loads or when a form is posted.
+
+### Supported variables
+
+| Variable name | Description |
+||-|
+| `socket_ip` | The IP address of the direct connection to Azure Front Door edge. If the client used an HTTP proxy or a load balancer to send the request, the value of `socket_ip` is the IP address of the proxy or load balancer. |
+| `client_ip` | The IP address of the client that made the original request. If there was an `X-Forwarded-For` header in the request, then the client IP address is picked from the header. |
+| `client_port` | The IP port of the client that made the request. |
+| `hostname` | The host name in the request from the client. |
+| `geo_country` | Indicates the requester's country/region of origin through its country/region code. |
+| `http_method` | The method used to make the URL request, such as `GET` or `POST`. |
+| `http_version` | The request protocol. Usually `HTTP/1.0`, `HTTP/1.1`, or `HTTP/2.0`. |
+| `query_string` | The list of variable/value pairs that follows the "?" in the requested URL.<br />For example, in the request `http://contoso.com:8080/article.aspx?id=123&title=fabrikam`, the `query_string` value will be `id=123&title=fabrikam`. |
+| `request_scheme` | The request scheme: `http` or `https`. |
+| `request_uri` | The full original request URI (with arguments).<br />For example, in the request `http://contoso.com:8080/article.aspx?id=123&title=fabrikam`, the `request_uri` value will be `/article.aspx?id=123&title=fabrikam`. |
+| `ssl_protocol` | The protocol of an established TLS connection. |
+| `server_port` | The port of the server that accepted a request. |
+| `url_path` | Identifies the specific resource in the host that the web client wants to access. This is the part of the request URI without the arguments.<br />For example, in the request `http://contoso.com:8080/article.aspx?id=123&title=fabrikam`, the `url_path` value will be `/article.aspx`. |
+
+### Server variable format
+
+Server variables can be specified using the following formats:
+
+* `{variable}`: Include the entire server variable. For example, if the client IP address is `111.222.333.444` then the `{client_ip}` token would evaluate to `111.222.333.444`.
+* `{variable:offset}`: Include the server variable after a specific offset, until the end of the variable. The offset is zero-based. For example, if the client IP address is `111.222.333.444` then the `{client_ip:3}` token would evaluate to `.222.333.444`.
+* `{variable:offset:length}`: Include the server variable after a specific offset, up to the specified length. The offset is zero-based. For example, if the client IP address is `111.222.333.444` then the `{client_ip:4:3}` token would evaluate to `222`.
+
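The three token formats above can be modeled with a short sketch. This is an illustrative emulation of the documented substitution behavior, not Front Door's implementation:

```python
# Expands {variable}, {variable:offset}, and {variable:offset:length}
# tokens against a dictionary of server variable values.
import re

def expand_tokens(template: str, variables: dict) -> str:
    def repl(m):
        name, offset, length = m.group(1), m.group(2), m.group(3)
        value = variables.get(name, "")
        start = int(offset) if offset else 0           # zero-based offset
        end = start + int(length) if length else len(value)
        return value[start:end]
    return re.sub(r"\{(\w+)(?::(\d+))?(?::(\d+))?\}", repl, template)
```

For example, with a client IP of `111.222.333.444`, the template `clientIp={client_ip:4:3}` would evaluate to `clientIp=222`.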
+### Supported actions
+
+Server variables are supported on the following actions:
-* Request header
-* Response header
* Cache key query string
-* URL rewrite
+* Modify request header
+* Modify response header
* URL redirect
+* URL rewrite
## Next steps
-* Learn more about [Azure Front Door Stanard/Premium Rule Set](concept-rule-set.md).
-* Learn more about [Rule Set Match Conditions](concept-rule-set-match-conditions.md).
+* Learn more about [Azure Front Door Standard/Premium Rule Set](concept-rule-set.md).
+* Learn more about [Rule Set match conditions](concept-rule-set-match-conditions.md).
frontdoor Concept Rule Set Match Conditions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/standard-premium/concept-rule-set-match-conditions.md
Previously updated : 03/24/2021 Last updated : 03/31/2021
In this example, we match all requests where the request uses the `HTTP` protoco
## <a name="RequestUrl"></a> Request URL
-Identifies requests that match the specified URL. The entire URL is evaluated. You can specify multiple values to match, which will be combined using OR logic.
+Identifies requests that match the specified URL. The entire URL is evaluated, including the protocol and query string, but not the fragment. You can specify multiple values to match, which will be combined using OR logic.
> [!TIP] > When you use this rule condition, be sure to include the protocol. For example, use `https://www.contoso.com` instead of just `www.contoso.com`.
Regular expressions don't support the following operations:
* Callouts and embedded code. * Atomic grouping and possessive quantifiers.
-## ARM template support
-
-Rule sets can be configured using Azure Resource Manager templates. [See an example template](https://github.com/Azure/azure-quickstart-templates/tree/master/201-front-door-standard-premium-rule-set). You can add match conditions by using the JSON or Bicep snippets included in the examples above.
- ## Next steps * Learn more about [Rule Set](concept-rule-set.md).
frontdoor Concept Rule Set https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/standard-premium/concept-rule-set.md
Previously updated : 02/18/2021 Last updated : 03/31/2021
> [!Note] > This documentation is for Azure Front Door Standard/Premium (Preview). Looking for information on Azure Front Door? View [here](../front-door-overview.md).
-A Rule Set is a customized rule engine that groups a combination of rules into a single set that you can associate with multiple routes. The Rule Set allows you to customize how requests get processed at the edge and how Azure Front Door handles those requests.
+A Rule Set is a customized rule engine that groups a combination of rules into a single set. You can associate a Rule Set with multiple routes. The Rule Set allows you to customize how requests get processed at the edge, and how Azure Front Door handles those requests.
> [!IMPORTANT] > Azure Front Door Standard/Premium (Preview) is currently in public preview.
A Rule Set is a customized rule engine that groups a combination of rules into a
* Route requests to mobile or desktop versions of your application based on the client device type.
-* Using redirect capabilities to return 301, 302, 307, and 308 redirects to the client to direct them to new hostnames, paths, query string, or protocols.
+* Using redirect capabilities to return 301, 302, 307, and 308 redirects to the client to direct them to new hostnames, paths, query strings, or protocols.
* Dynamically modify the caching configuration of your route based on the incoming requests.
With Azure Front Door Rule Set, you can create a combination of Rules Set config
For more information about quota limits, refer to [Azure subscription and service limits, quotas and constraints](../../azure-resource-manager/management/azure-subscription-service-limits.md).
-* *Rules set*: A set of rules that gets associated to one or multiple [Routes](concept-route.md). Each configuration is limited to 25 rules. You can create up to 10 configurations.
+* *Rule Set*: A set of rules that can be associated with one or multiple [routes](concept-route.md).
-* *Rules Set Rule*: A rule composed of up to 10 match conditions and 5 actions. Rules are local to a Rule Set and cannot be exported to use across Rule Sets. Users can create the same rule in multiple Rule Sets.
+* *Rule Set rule*: A rule composed of up to 10 match conditions and 5 actions. Rules are local to a Rule Set and cannot be exported to use across Rule Sets. Users can create the same rule in multiple Rule Sets.
-* *Match Condition*: There are many match conditions that can be utilized to parse your incoming requests. A rule can contain up to 10 match conditions. Match conditions are evaluated with an **AND** operator. *Regular expression is supported in conditions*. A full list of match conditions can be found in [Rule Set Condition](concept-rule-set-match-conditions.md).
+* *Match condition*: There are many match conditions that can be utilized to parse your incoming requests. A rule can contain up to 10 match conditions. Match conditions are evaluated with an **AND** operator. *Regular expression is supported in conditions*. A full list of match conditions can be found in [Rule Set match conditions](concept-rule-set-match-conditions.md).
-* *Action*: Actions dictate how AFD handles the incoming requests based on the matching conditions. You can modify caching behaviors, modify request headers/response headers, do URL rewrite and URL redirection. *Server variables are supported on Action*. A rule can contain up to 10 match conditions. A full list of actions can be found [Rule Set Actions](concept-rule-set-actions.md).
+* *Action*: Actions dictate how AFD handles the incoming requests based on the matching conditions. You can modify caching behaviors, modify request and response headers, and perform URL rewrites and URL redirects. *Server variables are supported in actions*. A rule can contain up to 5 actions. A full list of actions can be found in [Rule Set actions](concept-rule-set-actions.md).
+
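As described above, match conditions within a rule combine with an **AND** operator, while multiple values inside a single condition combine with OR. A minimal illustrative sketch of that evaluation logic (the `rule_matches` helper and the predicates are hypothetical, not part of any Azure SDK):

```python
def rule_matches(request: dict, conditions: list) -> bool:
    """A rule fires only when every match condition holds (AND logic)."""
    return all(condition(request) for condition in conditions)

# Multiple values inside a single condition combine with OR (modeled here via `in`).
conditions = [
    lambda r: r["protocol"] == "HTTP",                     # request protocol condition
    lambda r: r["url_path"] in ("/article.aspx", "/api"),  # request path, OR over values
]

print(rule_matches({"protocol": "HTTP", "url_path": "/api"}, conditions))   # True
print(rule_matches({"protocol": "HTTPS", "url_path": "/api"}, conditions))  # False
```

Only requests satisfying every condition trigger the rule's actions.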
+## ARM template support
+
+Rule Sets can be configured using Azure Resource Manager templates. [See an example template](https://github.com/Azure/azure-quickstart-templates/tree/master/201-front-door-standard-premium-rule-set). You can customize the behavior by using the JSON or Bicep snippets included in the documentation examples for [match conditions](concept-rule-set-match-conditions.md) and [actions](concept-rule-set-actions.md).
## Next steps * Learn how to [create a Front Door Standard/Premium](create-front-door-portal.md).
-* Learn how to configure your first [Rule Set](how-to-configure-rule-set.md).
-
+* Learn how to configure your first [Rule Set](how-to-configure-rule-set.md).
frontdoor Create Front Door Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/standard-premium/create-front-door-portal.md
description: This quickstart shows how to use Azure Front Door Standard/Premium
-Customer intent: As an IT admin, I want to direct user traffic to ensure high availability of web applications.
+# Customer intent: As an IT admin, I want to direct user traffic to ensure high availability of web applications.
ms.devlang: na
frontdoor How To Configure Https Custom Domain https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/standard-premium/how-to-configure-https-custom-domain.md
Last updated 02/18/2021
-# As a website owner, I want to add a custom domain to my Front Door configuration so that my users can use my custom domain to access my content.
+#Customer intent: As a website owner, I want to add a custom domain to my Front Door configuration so that my users can use my custom domain to access my content.
# Configure HTTPS on a Front Door Standard/Premium SKU (Preview) custom domain using the Azure portal
hdinsight Hdinsight Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/hdinsight/hdinsight-release-notes.md
Title: Release notes for Azure HDInsight description: Latest release notes for Azure HDInsight. Get development tips and details for Hadoop, Spark, R Server, Hive, and more.-+ Last updated 03/23/2021
HDInsight added [Spark 3.0.0](https://spark.apache.org/docs/3.0.0/) support to H
### Kafka 2.4 preview HDInsight added [Kafka 2.4.1](http://kafka.apache.org/24/documentation.html) support to HDInsight 4.0 as a Preview feature.
+### Eav4-series support
+HDInsight added Eav4-series support in this release. Learn more about the [Eav4-series here](../virtual-machines/eav4-easv4-series.md). The series has been made available in the following regions:
+
+* Australia East
+* Brazil South
+* Central US
+* East Asia
+* East US
+* Japan East
+* Southeast Asia
+* UK South
+* West Europe
+* West US 2
+ ### Moving to Azure virtual machine scale sets HDInsight now uses Azure virtual machines to provision the cluster. The service is gradually migrating to [Azure virtual machine scale sets](../virtual-machine-scale-sets/overview.md). The entire process may take months. After your regions and subscriptions are migrated, newly created HDInsight clusters will run on virtual machine scale sets without customer actions. No breaking change is expected.
The following changes will happen in upcoming releases.
### OS version upgrade HDInsight will be upgrading OS version from Ubuntu 16.04 to 18.04. The upgrade will complete before April 2021.
-### HDInsight 3.6 end of support on June 30 2021
-HDInsight 3.6 will be end of support. Starting form June 30 2021, customers can't create new HDInsight 3.6 clusters. Existing clusters will run as is without the support from Microsoft. Consider moving to HDInsight 4.0 to avoid potential system/support interruption.
+### Basic support for HDInsight 3.6 starting July 1, 2021
+Starting July 1, 2021, Microsoft will offer [Basic support](hdinsight-component-versioning.md#support-options-for-hdinsight-versions) for certain HDInsight 3.6 cluster types. The Basic support plan will be available until April 3, 2022. You'll automatically be enrolled in Basic support starting July 1, 2021. No action is required by you to opt in. See [our documentation](hdinsight-36-component-versioning.md) for which cluster types are included under Basic support.
+
+We don't recommend building any new solutions on HDInsight 3.6, and you should freeze changes on existing 3.6 environments. We recommend that you [migrate your clusters to HDInsight 4.0](hdinsight-version-release.md#how-to-upgrade-to-hdinsight-40). Learn more about [what's new in HDInsight 4.0](hdinsight-version-release.md#whats-new-in-hdinsight-40).
## Bug fixes HDInsight continues to make cluster reliability and performance improvements.
HDInsight continues to make cluster reliability and performance improvements.
## Component version change Added support for Spark 3.0.0 and Kafka 2.4.1 as Preview. You can find the current component versions for HDInsight 4.0 and HDInsight 3.6 in [this doc](./hdinsight-component-versioning.md).+
+## Recommended features
+### Service tags
+Service tags simplify restricting network access to Azure services for Azure virtual machines and Azure virtual networks. Service tags in your network security group (NSG) rules allow or deny traffic to a specific Azure service. The rule can be set globally or per Azure region. Azure maintains the IP addresses underlying each tag. HDInsight service tags for network security groups (NSGs) are groups of IP addresses for health and management services. These groups help minimize complexity for security rule creation. HDInsight customers can enable service tags through the Azure portal, PowerShell, and the REST API. For more information, see [Network security group (NSG) service tags for Azure HDInsight](./hdinsight-service-tags.md).
hdinsight Apache Spark Create Standalone Application https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/hdinsight/spark/apache-spark-create-standalone-application.md
Last updated 08/21/2020
-#customer intent: As a developer new to Apache Spark and to Apache Spark in Azure HDInsight, I want to learn how to create a Scala Maven application for Spark in HDInsight using IntelliJ.
+# Customer intent: As a developer new to Apache Spark and to Apache Spark in Azure HDInsight, I want to learn how to create a Scala Maven application for Spark in HDInsight using IntelliJ.
# Tutorial: Create a Scala Maven application for Apache Spark in HDInsight using IntelliJ
hdinsight Apache Spark Ipython Notebook Machine Learning https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/hdinsight/spark/apache-spark-ipython-notebook-machine-learning.md
Last updated 04/07/2020
-#customer intent: As a developer new to Apache Spark and to Apache Spark in Azure HDInsight, I want to learn how to create a simple machine learning Spark application.
+# Customer intent: As a developer new to Apache Spark and to Apache Spark in Azure HDInsight, I want to learn how to create a simple machine learning Spark application.
# Tutorial: Build an Apache Spark machine learning application in Azure HDInsight
hdinsight Apache Spark Load Data Run Query https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/hdinsight/spark/apache-spark-load-data-run-query.md
Last updated 02/12/2020-
-#custom intent: As a developer new to Apache Spark and to Apache Spark in Azure HDInsight, I want to learn how to load data into a Spark cluster, so I can run interactive SQL queries against the data.
+# Customer intent: As a developer new to Apache Spark and to Apache Spark in Azure HDInsight, I want to learn how to load data into a Spark cluster, so I can run interactive SQL queries against the data.
# Tutorial: Load data and run queries on an Apache Spark cluster in Azure HDInsight
hdinsight Apache Spark Manage Dependencies https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/hdinsight/spark/apache-spark-manage-dependencies.md
Last updated 09/09/2020
-#customer intent: As a developer for Apache Spark and Apache Spark in Azure HDInsight, I want to learn how to manage my Spark application dependencies and install packages on my HDInsight cluster.
+# Customer intent: As a developer for Apache Spark and Apache Spark in Azure HDInsight, I want to learn how to manage my Spark application dependencies and install packages on my HDInsight cluster.
# Manage Spark application dependencies
hdinsight Apache Spark Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/hdinsight/spark/apache-spark-overview.md
Last updated 09/21/2020
-#customer intent: As a developer new to Apache Spark and Apache Spark in Azure HDInsight, I want to have a basic understanding of Microsoft's implementation of Apache Spark in Azure HDInsight so I can decide if I want to use it rather than build my own cluster.
+# Customer intent: As a developer new to Apache Spark and Apache Spark in Azure HDInsight, I want to have a basic understanding of Microsoft's implementation of Apache Spark in Azure HDInsight so I can decide if I want to use it rather than build my own cluster.
# What is Apache Spark in Azure HDInsight
hdinsight Apache Spark Use Bi Tools https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/hdinsight/spark/apache-spark-use-bi-tools.md
Last updated 04/21/2020-
-#custom intent: As a developer new to Apache Spark and to Apache Spark in Azure HDInsight, I want to learn how to virtualize Spark data in BI tools.
+#Customer intent: As a developer new to Apache Spark and to Apache Spark in Azure HDInsight, I want to learn how to virtualize Spark data in BI tools.
# Tutorial: Analyze Apache Spark data using Power BI in HDInsight
hdinsight Spark Cruise https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/hdinsight/spark/spark-cruise.md
Last updated 07/27/2020
-#customer intent: As an Apache Spark developer, I would like to learn about the tools and features to optimize my Spark workloads on Azure HDInsight.
+# Customer intent: As an Apache Spark developer, I would like to learn about the tools and features to optimize my Spark workloads on Azure HDInsight.
# SparkCruise on Azure HDInsight
iot-accelerators About Iot Accelerators https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-accelerators/about-iot-accelerators.md
--
-# Intent: As a developer or IT Pro, I need to know what the IoT solution accelerators do, so I can understand if they can help me to build and manage my IoT solution.
+# Customer intent: As a developer or IT Pro, I need to know what the IoT solution accelerators do, so I can understand if they can help me to build and manage my IoT solution.
# What are Azure IoT solution accelerators?
iot-accelerators Iot Accelerators Device Simulation Advanced Device https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-accelerators/iot-accelerators-device-simulation-advanced-device.md
Last updated 03/18/2019
-# As an IT Pro, I need to create advanced custom simulated devices to test my IoT solution.
+# Customer intent: As an IT Pro, I need to create advanced custom simulated devices to test my IoT solution.
# Create an advanced device model
iot-accelerators Iot Accelerators Device Simulation Create Custom Device https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-accelerators/iot-accelerators-device-simulation-create-custom-device.md
Last updated 10/25/2018
-# As an IT Pro, I need to create simulated devices to test my IoT solution.
+#Customer intent: As an IT Pro, I need to create simulated devices to test my IoT solution.
# Tutorial: Create a custom simulated device
iot-accelerators Iot Accelerators Device Simulation Create Simulation https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-accelerators/iot-accelerators-device-simulation-create-simulation.md
Last updated 03/08/2019
-# As an IT Pro, I need to create simulated devices to test my IoT solution.
+#Customer intent: As an IT Pro, I need to create simulated devices to test my IoT solution.
# Tutorial: Create and run an IoT device simulation
iot-accelerators Iot Accelerators Device Simulation Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-accelerators/iot-accelerators-device-simulation-overview.md
Last updated 12/03/2018
-# As a developer or IT Pro, I want to understand what the Device Simulation solution accelerator is so that I can understand if it can help me test my IoT solution.
+#Customer intent: As a developer or IT Pro, I want to understand what the Device Simulation solution accelerator is so that I can understand if it can help me test my IoT solution.
# Device Simulation solution accelerator overview
iot-accelerators Iot Accelerators Device Simulation Protobuf https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-accelerators/iot-accelerators-device-simulation-protobuf.md
Last updated 11/06/2018
-# As an IT Pro, I need to create advanced custom simulated devices to test my IoT solution.
+#Customer intent: As an IT Pro, I need to create advanced custom simulated devices to test my IoT solution.
# Serialize telemetry using Protocol Buffers
iot-accelerators Iot Accelerators Permissions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-accelerators/iot-accelerators-permissions.md
Last updated 12/13/2018
-# As a developer or IT Pro, I want to deploy and manage a solution accelerator from a web site to quickly get a demo or production environment up and running
+#Customer intent: As a developer or IT Pro, I want to deploy and manage a solution accelerator from a web site to quickly get a demo or production environment up and running.
# Use the azureiotsolutions.com site to deploy your solution accelerator
iot-accelerators Quickstart Connected Factory Deploy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-accelerators/quickstart-connected-factory-deploy.md
Last updated 03/08/2019
-# As an IT Pro, I want to try out a cloud-based solution to understand how I can monitor and manage my industrial IoT devices.
+#Customer intent: As an IT Pro, I want to try out a cloud-based solution to understand how I can monitor and manage my industrial IoT devices.
# Quickstart: Try a cloud-based solution to manage my industrial IoT devices
iot-central Howto Customize Ui https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-central/core/howto-customize-ui.md
-# As an administrator, I want to customize the themes and help links within Central so that my company's brand is represented within the app.
-
+#Customer intent: As an administrator, I want to customize the themes and help links within Central so that my company's brand is represented within the app.
# Customize the Azure IoT Central UI
iot-central Howto Monitor Application Health https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-central/core/howto-monitor-application-health.md
-# As an operator, I want to monitor the overall health of the devices and data exports in my IoT Central application.
+#Customer intent: As an operator, I want to monitor the overall health of the devices and data exports in my IoT Central application.
# Monitor the overall health of an IoT Central application
iot-central Howto Use App Templates https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-central/core/howto-use-app-templates.md
-# As a solution manager, I want to have one or more application templates available in my library that I can use when deploying to new organizations.
+#Customer intent: As a solution manager, I want to have one or more application templates available in my library that I can use when deploying to new organizations.
# Export your application
iot-central Quick Create Simulated Device https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-central/core/quick-create-simulated-device.md
-# As a builder, I want to try out creating a device template and adding a simulated device to my IoT Central application.
+#Customer intent: As a builder, I want to try out creating a device template and adding a simulated device to my IoT Central application.
# Quickstart: Add a simulated device to your IoT Central application
iot-central Troubleshoot Connection https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-central/core/troubleshoot-connection.md
-# As a device developer, I want to understand why data from my devices isn't showing up in IoT Central, and the steps I can take to rectify the issue.
+#Customer intent: As a device developer, I want to understand why data from my devices isn't showing up in IoT Central, and the steps I can take to rectify the issue.
# Troubleshoot why data from your devices isn't showing up in Azure IoT Central
iot-central Tutorial Connect Device https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-central/core/tutorial-connect-device.md
zone_pivot_groups: programming-languages-set-twenty-six
# - id: programming-language-python # Title: Python
-# As a device developer, I want to try out using device code that uses one of the the Azure IoT device SDKs. I want to understand how to send telemetry from a device, synchronize properties with the device, and control the device using commands.
+#Customer intent: As a device developer, I want to try out using device code that uses one of the Azure IoT device SDKs. I want to understand how to send telemetry from a device, synchronize properties with the device, and control the device using commands.
# Tutorial: Create and connect a client application to your Azure IoT Central application
iot-dps How To Troubleshoot Dps https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-dps/how-to-troubleshoot-dps.md
Last updated 02/14/2021
-# As an operator for Azure IoT Hub DPS, I need to know how to find out when devices are disconnecting unexpectedly and troubleshoot resolve those issues right away
+#Customer intent: As an operator for Azure IoT Hub DPS, I need to know how to find out when devices are disconnecting unexpectedly and troubleshoot and resolve those issues right away.
+ # Troubleshooting with Azure IoT Hub Device Provisioning Service Connectivity issues for IoT devices can be difficult to troubleshoot because there are many possible points of failure, such as attestation failures and registration failures. This article provides guidance on how to detect and troubleshoot device connectivity issues via [Azure Monitor](../azure-monitor/overview.md).
iot-edge How To Install Iot Edge On Windows https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-edge/how-to-install-iot-edge-on-windows.md
This article lists the steps to set up IoT Edge on a Windows device. These steps
* Professional, Enterprise, or Server editions * Minimum Free Memory: 2 GB * Minimum Free Disk Space: 10 GB
- * If you're creating a new deployment using Windows 10, make sure you enable Hyper-V. For more information, see how to [Install Hyper-V on Windows 10](/virtualization/hyper-v-on-windows/quick-start/enable-hyper-v.md).
- * If you're creating a new deployment using Windows Server, make sure you install Hyper-V role. For more information, see how to [Install the Hyper-V role on Windows Server](/windows-server/virtualization/hyper-v/get-started/install-the-hyper-v-role-on-windows-server.md).
+ * If you're creating a new deployment using Windows 10, make sure you enable Hyper-V. For more information, see how to [Install Hyper-V on Windows 10](/virtualization/hyper-v-on-windows/quick-start/enable-hyper-v).
+ * If you're creating a new deployment using Windows Server, make sure you install Hyper-V role. For more information, see how to [Install the Hyper-V role on Windows Server](/windows-server/virtualization/hyper-v/get-started/install-the-hyper-v-role-on-windows-server).
* If you're creating a new deployment using a VM, make sure you configure nested virtualization correctly. For more information, see the [nested virtualization](nested-virtualization.md) guide. * Access to Windows Admin Center with the Azure IoT Edge extension for Windows Admin Center installed:
iot-edge Reference Iot Edge For Linux On Windows Scripts https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-edge/reference-iot-edge-for-linux-on-windows-scripts.md
The **Deploy-Eflow** command is the main deployment method. The deployment comma
| registrationId | The registration ID of an existing IoT Edge device | Registration ID for provisioning an IoT Edge device (**X509** or **symmetric**). | | identityCertLocVm | Directory path; must be in a folder that can be owned by the `iotedge` service | Absolute destination path of the identity certificate on your virtual machine for provisioning an IoT Edge device (**X509** or **symmetric**). | | identityCertLocWin | Directory path | Absolute source path of the identity certificate in Windows for provisioning an IoT Edge device (**X509** or **symmetric**). |
-| identityPkLocVm | | Directory path; must be in a folder that can be owned by the `iotedge` service | Absolute destination path of the identity private key on your virtual machine for provisioning an IoT Edge device (**X509** or **symmetric**). |
+| identityPkLocVm | Directory path; must be in a folder that can be owned by the `iotedge` service | Absolute destination path of the identity private key on your virtual machine for provisioning an IoT Edge device (**X509** or **symmetric**). |
| identityPkLocWin | Directory path | Absolute source path of the identity private key in Windows for provisioning an IoT Edge device (**X509** or **symmetric**). | | vmSizeDefintion | No longer than 30 characters | Definition of the number of cores and available RAM for the virtual machine. **Default value**: Standard_K8S_v1. | | vmDiskSize | Between 8 GB and 256 GB | Maximum disk size of the dynamically expanding virtual hard disk. **Default value**: 16 GB. |
The **Provision-EflowVm** command adds the provisioning information for your IoT
| registrationId | The registration ID of an existing IoT Edge device | Registration ID for provisioning an IoT Edge device (**DPS**). | | identityCertLocVm | Directory path; must be in a folder that can be owned by the `iotedge` service | Absolute destination path of the identity certificate on your virtual machine for provisioning an IoT Edge device (**DPS**, **X509**). | | identityCertLocWin | Directory path | Absolute source path of the identity certificate in Windows for provisioning an IoT Edge device (**dps**, **X509**). |
-| identityPkLocVm | | Directory path; must be in a folder that can be owned by the `iotedge` service | Absolute destination path of the identity private key on your virtual machine for provisioning an IoT Edge device (**DPS**, **X509**). |
+| identityPkLocVm | Directory path; must be in a folder that can be owned by the `iotedge` service | Absolute destination path of the identity private key on your virtual machine for provisioning an IoT Edge device (**DPS**, **X509**). |
| identityPkLocWin | Directory path | Absolute source path of the identity private key in Windows for provisioning an IoT Edge device (**dps**, **X509**). | ## Get-EflowVmName
Learn how to use these commands in the following article:
* [Install Azure IoT Edge for Linux on Windows](./how-to-install-iot-edge-windows-on-windows.md)
-* Refer to [the IoT Edge for Linux on Windows PowerShell script reference](reference-iot-edge-for-linux-on-windows-scripts.md#deploy-eflow) for all the commands available through PowerShell.
+* Refer to [the IoT Edge for Linux on Windows PowerShell script reference](reference-iot-edge-for-linux-on-windows-scripts.md#deploy-eflow) for all the commands available through PowerShell.
iot-hub Iot Hub Devguide Pricing https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/iot-hub-devguide-pricing.md
| Cloud-to-device messages | Successfully sent messages are charged in 4-KB chunks, for example, a 6-KB message is charged as two messages. | | File uploads | File transfer to Azure Storage is not metered by IoT Hub. File transfer initiation and completion messages are charged as messages metered in 4-KB increments. For example, transferring a 10-MB file is charged as two messages in addition to the Azure Storage cost. | | Direct methods | Successful method requests are charged in 4-KB chunks, and responses are charged in 4-KB chunks as additional messages. Requests to disconnected devices are charged as messages in 4-KB chunks. For example, a method with a 4-KB body that results in a response with no body from the device is charged as two messages. A method with a 6-KB body that results in a 1-KB response from the device is charged as two messages for the request plus another message for the response. |
-| Device and module twin reads | Twin reads from the device or module and from the solution back end are charged as messages in 512-byte chunks. For example, reading a 6-KB twin is charged as 12 messages. |
-| Device and module twin updates (tags and properties) | Twin updates from the device or module and from the solution back end are charged as messages in 512-byte chunks. For example, reading a 6-KB twin is charged as 12 messages. |
-| Device and module twin queries | Queries are charged as messages depending on the result size in 512-byte chunks. |
+| Device and module twin reads | Twin reads from the device or module and from the solution back end are charged as messages in 4-KB chunks. For example, reading an 8-KB twin is charged as 2 messages. |
+| Device and module twin updates (tags and properties) | Twin updates from the device or module and from the solution back end are charged as messages in 4-KB chunks. For example, updating a 12-KB twin is charged as 3 messages. |
+| Device and module twin queries | Queries are charged as messages depending on the result size in 4-KB chunks. |
| Jobs operations <br/> (create, update, list, delete) | Not charged. | | Jobs per-device operations | Jobs operations (such as twin updates, and methods) are charged as normal. For example, a job resulting in 1000 method calls with 1-KB requests and empty-body responses is charged 1000 messages. | | Keep-alive messages | When using AMQP or MQTT protocols, messages exchanged to establish the connection and messages exchanged in the negotiation are not charged. |
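The 4-KB metering described in the rows above is ceiling division on the payload size. A minimal illustrative sketch (the `chargeable_messages` helper is hypothetical, not an Azure SDK function, and the empty-payload case assumes the table's statement that an empty-body response still counts as a message):

```python
import math

MESSAGE_CHUNK_BYTES = 4 * 1024  # 4-KB billing increment

def chargeable_messages(payload_bytes: int) -> int:
    """Number of billed messages for a single operation payload."""
    if payload_bytes == 0:
        return 1  # an empty operation still counts as one message
    return math.ceil(payload_bytes / MESSAGE_CHUNK_BYTES)

print(chargeable_messages(8 * 1024))   # 2: an 8-KB twin read is two messages
print(chargeable_messages(12 * 1024))  # 3: a 12-KB twin update is three messages
print(chargeable_messages(6 * 1024))   # 2: 6 KB rounds up to two 4-KB chunks
```

The direct-method example in the table follows the same rule applied separately to the request and the response payloads.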
iot-hub Iot Hub How To Clone https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/iot-hub-how-to-clone.md
Last updated 12/09/2019
-# intent: As a customer using IoT Hub, I need to clone my IoT hub to another region.
+# Customer intent: As a customer using IoT Hub, I need to clone my IoT hub to another region.
+ # How to clone an Azure IoT hub to another region This article explores ways to clone an IoT Hub and provides some questions you need to answer before you start. Here are several reasons you might want to clone an IoT hub:
iot-hub Iot Hub Troubleshoot Connectivity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/iot-hub-troubleshoot-connectivity.md
Last updated 11/06/2020
-# As an operator for Azure IoT Hub, I need to know how to find out when devices are disconnecting unexpectedly and troubleshoot resolve those issues right away
+#Customer intent: As an operator for Azure IoT Hub, I need to know how to find out when devices are disconnecting unexpectedly and troubleshoot and resolve those issues right away.
+ # Monitor, diagnose, and troubleshoot disconnects with Azure IoT Hub
Connectivity issues for IoT devices can be difficult to troubleshoot because there are many possible points of failure. Application logic, physical networks, protocols, hardware, IoT Hub, and other cloud services can all cause problems. The ability to detect and pinpoint the source of an issue is critical. However, an IoT solution at scale could have thousands of devices, so it's not practical to check individual devices manually. IoT Hub integrates with two Azure services to help you:
iot-hub Iot Hub Troubleshoot Error 400027 Connectionforcefullyclosedonnewconnection https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/iot-hub-troubleshoot-error-400027-connectionforcefullyclosedonnewconnection.md
Last updated 01/30/2020
-# As a developer or operator for Azure IoT Hub, I want to resolve 400027 ConnectionForcefullyClosedOnNewConnection errors.
+#Customer intent: As a developer or operator for Azure IoT Hub, I want to resolve 400027 ConnectionForcefullyClosedOnNewConnection errors.
# 400027 ConnectionForcefullyClosedOnNewConnection
iot-hub Iot Hub Troubleshoot Error 401003 Iothubunauthorized https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/iot-hub-troubleshoot-error-401003-iothubunauthorized.md
Last updated 11/06/2020
-# As a developer or operator for Azure IoT Hub, I want to resolve 401003 IoTHubUnauthorized errors.
+#Customer intent: As a developer or operator for Azure IoT Hub, I want to resolve 401003 IoTHubUnauthorized errors.
# 401003 IoTHubUnauthorized
iot-hub Iot Hub Troubleshoot Error 403002 Iothubquotaexceeded https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/iot-hub-troubleshoot-error-403002-iothubquotaexceeded.md
Last updated 01/30/2020
-# As a developer or operator for Azure IoT Hub, I want to resolve 403002 IoTHubQuotaExceeded errors.
+#Customer intent: As a developer or operator for Azure IoT Hub, I want to resolve 403002 IoTHubQuotaExceeded errors.
# 403002 IoTHubQuotaExceeded
iot-hub Iot Hub Troubleshoot Error 403004 Devicemaximumqueuedepthexceeded https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/iot-hub-troubleshoot-error-403004-devicemaximumqueuedepthexceeded.md
Last updated 01/30/2020
-# As a developer or operator for Azure IoT Hub, I want to resolve 403004 DeviceMaximumQueueDepthExceeded errors.
+#Customer intent: As a developer or operator for Azure IoT Hub, I want to resolve 403004 DeviceMaximumQueueDepthExceeded errors.
# 403004 DeviceMaximumQueueDepthExceeded
iot-hub Iot Hub Troubleshoot Error 403006 Devicemaximumactivefileuploadlimitexceeded https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/iot-hub-troubleshoot-error-403006-devicemaximumactivefileuploadlimitexceeded.md
Last updated 01/30/2020
-# As a developer or operator for Azure IoT Hub, I want to resolve 403006 DeviceMaximumActiveFileUploadLimitExceeded errors.
+#Customer intent: As a developer or operator for Azure IoT Hub, I want to resolve 403006 DeviceMaximumActiveFileUploadLimitExceeded errors.
# 403006 DeviceMaximumActiveFileUploadLimitExceeded
iot-hub Iot Hub Troubleshoot Error 404001 Devicenotfound https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/iot-hub-troubleshoot-error-404001-devicenotfound.md
Last updated 01/30/2020
-# As a developer or operator for Azure IoT Hub, I want to resolve 404001 DeviceNotFound errors.
+#Customer intent: As a developer or operator for Azure IoT Hub, I want to resolve 404001 DeviceNotFound errors.
# 404001 DeviceNotFound
iot-hub Iot Hub Troubleshoot Error 404103 Devicenotonline https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/iot-hub-troubleshoot-error-404103-devicenotonline.md
Last updated 01/30/2020
-# As a developer or operator for Azure IoT Hub, I want to resolve 404103 DeviceNotOnline errors.
+#Customer intent: As a developer or operator for Azure IoT Hub, I want to resolve 404103 DeviceNotOnline errors.
# 404103 DeviceNotOnline
iot-hub Iot Hub Troubleshoot Error 404104 Deviceconnectionclosedremotely https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/iot-hub-troubleshoot-error-404104-deviceconnectionclosedremotely.md
Last updated 01/30/2020
-# As a developer or operator for Azure IoT Hub, I want to resolve 404104 DeviceConnectionClosedRemotely errors.
+#Customer intent: As a developer or operator for Azure IoT Hub, I want to resolve 404104 DeviceConnectionClosedRemotely errors.
# 404104 DeviceConnectionClosedRemotely
iot-hub Iot Hub Troubleshoot Error 409001 Devicealreadyexists https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/iot-hub-troubleshoot-error-409001-devicealreadyexists.md
Last updated 01/30/2020
-# As a developer or operator for Azure IoT Hub, I want to resolve 409001 DeviceAlreadyExists errors.
+#Customer intent: As a developer or operator for Azure IoT Hub, I want to resolve 409001 DeviceAlreadyExists errors.
# 409001 DeviceAlreadyExists
iot-hub Iot Hub Troubleshoot Error 409002 Linkcreationconflict https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/iot-hub-troubleshoot-error-409002-linkcreationconflict.md
Last updated 01/30/2020
-# As a developer or operator for Azure IoT Hub, I want to resolve 409002 LinkCreationConflict errors.
+#Customer intent: As a developer or operator for Azure IoT Hub, I want to resolve 409002 LinkCreationConflict errors.
# 409002 LinkCreationConflict
iot-hub Iot Hub Troubleshoot Error 412002 Devicemessagelocklost https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/iot-hub-troubleshoot-error-412002-devicemessagelocklost.md
Last updated 01/30/2020
-# As a developer or operator for Azure IoT Hub, I want to resolve 412002 DeviceMessageLockLost errors.
+#Customer intent: As a developer or operator for Azure IoT Hub, I want to resolve 412002 DeviceMessageLockLost errors.
# 412002 DeviceMessageLockLost
iot-hub Iot Hub Troubleshoot Error 429001 Throttlingexception https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/iot-hub-troubleshoot-error-429001-throttlingexception.md
Last updated 01/30/2020
-# As a developer or operator for Azure IoT Hub, I want to resolve 429001 ThrottlingException errors.
+#Customer intent: As a developer or operator for Azure IoT Hub, I want to resolve 429001 ThrottlingException errors.
# 429001 ThrottlingException
iot-hub Iot Hub Troubleshoot Error 500Xxx Internal Errors https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/iot-hub-troubleshoot-error-500xxx-internal-errors.md
Last updated 01/30/2020
-# As a developer or operator for Azure IoT Hub, I want to resolve 500xxx Internal errors.
+#Customer intent: As a developer or operator for Azure IoT Hub, I want to resolve 500xxx Internal errors.
# 500xxx Internal errors
iot-hub Iot Hub Troubleshoot Error 503003 Partitionnotfound https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/iot-hub-troubleshoot-error-503003-partitionnotfound.md
Last updated 01/30/2020
-# As a developer or operator for Azure IoT Hub, I want to resolve 503003 PartitionNotFound errors.
+#Customer intent: As a developer or operator for Azure IoT Hub, I want to resolve 503003 PartitionNotFound errors.
# 503003 PartitionNotFound
iot-hub Iot Hub Troubleshoot Error 504101 Gatewaytimeout https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/iot-hub-troubleshoot-error-504101-gatewaytimeout.md
Last updated 01/30/2020
-# As a developer or operator for Azure IoT Hub, I want to resolve 504101 GatewayTimeout errors.
+#Customer intent: As a developer or operator for Azure IoT Hub, I want to resolve 504101 GatewayTimeout errors.
# 504101 GatewayTimeout
iot-hub Quickstart Control Device Android https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/quickstart-control-device-android.md
Last updated 06/21/2019
-# As a developer new to IoT Hub, I need to use a service application written for Android to control devices connected to the hub.
+#Customer intent: As a developer new to IoT Hub, I need to use a service application written for Android to control devices connected to the hub.
# Quickstart: Control a device connected to an IoT hub (Android)
iot-hub Quickstart Control Device Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/quickstart-control-device-dotnet.md
ms.devlang: csharp
Last updated 03/04/2020
-# As a developer new to IoT Hub, I need to see how to use a service application to control a device connected to the hub.
+#Customer intent: As a developer new to IoT Hub, I need to see how to use a service application to control a device connected to the hub.
# Quickstart: Control a device connected to an IoT hub (.NET)
iot-hub Quickstart Control Device Java https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/quickstart-control-device-java.md
ms.devlang: java
Last updated 06/21/2019
-# As a developer new to IoT Hub, I need to see how to use a back-end application to control a device connected to the hub.
+#Customer intent: As a developer new to IoT Hub, I need to see how to use a back-end application to control a device connected to the hub.
# Quickstart: Control a device connected to an Azure IoT hub with Java
iot-hub Quickstart Control Device Node https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/quickstart-control-device-node.md
ms.devlang: nodejs
Last updated 06/21/2019
-# As a developer new to IoT Hub, I need to see how to use a back-end application to control a device connected to the hub.
+#Customer intent: As a developer new to IoT Hub, I need to see how to use a back-end application to control a device connected to the hub.
# Quickstart: Use Node.js to control a device connected to an Azure IoT hub
iot-hub Quickstart Control Device Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/quickstart-control-device-python.md
ms.devlang: python
Last updated 09/14/2020
-# As a developer new to IoT Hub, I need to see how to use a back-end application to control a device connected to the hub.
+#Customer intent: As a developer new to IoT Hub, I need to see how to use a back-end application to control a device connected to the hub.
# Quickstart: Control a device connected to an IoT hub (Python)
iot-hub Quickstart Send Telemetry Android https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/quickstart-send-telemetry-android.md
Last updated 03/15/2019
-# As a developer new to IoT Hub, I need to see how IoT Hub sends telemetry from an Android device to an IoT hub and how to read that telemetry data from the hub using a back-end application.
+#Customer intent: As a developer new to IoT Hub, I need to see how IoT Hub sends telemetry from an Android device to an IoT hub and how to read that telemetry data from the hub using a back-end application.
# Quickstart: Send IoT telemetry from an Android device
iot-hub Quickstart Send Telemetry C https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/quickstart-send-telemetry-c.md
Last updated 04/10/2019
-# As a C developer new to IoT Hub, I need to see how IoT Hub sends telemetry from a device to an IoT hub and how to read that telemetry data from the hub using a back-end application.
+#Customer intent: As a C developer new to IoT Hub, I need to see how IoT Hub sends telemetry from a device to an IoT hub and how to read that telemetry data from the hub using a back-end application.
# Quickstart: Send telemetry from a device to an IoT hub and read it with a back-end application (C)
iot-hub Quickstart Send Telemetry Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/quickstart-send-telemetry-dotnet.md
ms.devlang: csharp
Last updated 06/01/2020
-# As a developer new to IoT Hub, I need to see how IoT Hub sends telemetry from a device to an IoT hub and how to read that telemetry data from the hub using a service application.
+#Customer intent: As a developer new to IoT Hub, I need to see how IoT Hub sends telemetry from a device to an IoT hub and how to read that telemetry data from the hub using a service application.
# Quickstart: Send telemetry from a device to an IoT hub and read it with a service application (.NET)
iot-hub Quickstart Send Telemetry Ios https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/quickstart-send-telemetry-ios.md
Last updated 04/03/2019
-# As a developer, I need to build an end-to-end IoT solution that sends telemetry from a device to an IoT hub and reads that telemetry data from the hub using a back-end application.
+#Customer intent: As a developer, I need to build an end-to-end IoT solution that sends telemetry from a device to an IoT hub and reads that telemetry data from the hub using a back-end application.
# Quickstart: Send telemetry from a device to an IoT hub (iOS)
iot-hub Quickstart Send Telemetry Java https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/quickstart-send-telemetry-java.md
ms.devlang: java
Last updated 01/27/2021
-# As a developer new to IoT Hub, I need to see how IoT Hub sends telemetry from a device to an IoT hub and how to read that telemetry data from the hub using a back-end application.
+#Customer intent: As a developer new to IoT Hub, I need to see how IoT Hub sends telemetry from a device to an IoT hub and how to read that telemetry data from the hub using a back-end application.
# Quickstart: Send telemetry to an Azure IoT hub and read it with a Java application
iot-hub Quickstart Send Telemetry Node https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/quickstart-send-telemetry-node.md
ms.devlang: nodejs
Last updated 06/21/2019
-# As a developer new to IoT Hub, I need to see how IoT Hub sends telemetry from a device to an IoT hub and how to read that telemetry data from the hub using a back-end application.
+#Customer intent: As a developer new to IoT Hub, I need to see how IoT Hub sends telemetry from a device to an IoT hub and how to read that telemetry data from the hub using a back-end application.
# Quickstart: Send telemetry from a device to an IoT hub and read it with a back-end application (Node.js)
iot-hub Quickstart Send Telemetry Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/quickstart-send-telemetry-python.md
ms.devlang: python
Last updated 06/16/2020
-# As a developer new to IoT Hub, I need to see how IoT Hub sends telemetry from a device to an IoT hub and how to read that telemetry data from the hub using a back-end application.
+#Customer intent: As a developer new to IoT Hub, I need to see how IoT Hub sends telemetry from a device to an IoT hub and how to read that telemetry data from the hub using a back-end application.
# Quickstart: Send telemetry from a device to an IoT hub and read it with a back-end application (Python)
iot-hub Tutorial Connectivity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/tutorial-connectivity.md
Last updated 02/22/2019
-# As a developer, I want to know what tools I can use to verify connectivity between my IoT devices and my IoT hub.
+#Customer intent: As a developer, I want to know what tools I can use to verify connectivity between my IoT devices and my IoT hub.
# Tutorial: Use a simulated device to test connectivity with your IoT hub
iot-hub Tutorial Message Enrichments https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/tutorial-message-enrichments.md
Last updated 12/20/2019
-# intent: As a customer using Azure IoT Hub, I want to add information to the messages that come through my IoT hub and are sent to another endpoint. For example, I'd like to pass the IoT hub name to the application that reads the messages from the final endpoint, such as Azure Storage.
+# Customer intent: As a customer using Azure IoT Hub, I want to add information to the messages that come through my IoT hub and are sent to another endpoint. For example, I'd like to pass the IoT hub name to the application that reads the messages from the final endpoint, such as Azure Storage.
# Tutorial: Use Azure IoT Hub message enrichments
iot-pnp Concepts Iot Pnp Bridge https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-pnp/concepts-iot-pnp-bridge.md
-# As a solution or device builder, I want to understand what IoT Plug and Play bridge is and how I can connect existing sensors attached to a Windows or Linux PC as IoT Plug and Play devices.
+#Customer intent: As a solution or device builder, I want to understand what IoT Plug and Play bridge is and how I can connect existing sensors attached to a Windows or Linux PC as IoT Plug and Play devices.
# IoT Plug and Play bridge
iot-pnp Concepts Modeling Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-pnp/concepts-modeling-guide.md
-# As a device builder, I want to understand how to design and author a DTDL model for an IoT Plug and Play device.
+#Customer intent: As a device builder, I want to understand how to design and author a DTDL model for an IoT Plug and Play device.
iot-pnp Howto Author Pnp Bridge Adapter https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-pnp/howto-author-pnp-bridge-adapter.md
-# As a device builder, I want to understand the IoT Plug and Play bridge, learn how to build and IoT Plug and Play bridge adapter.
+#Customer intent: As a device builder, I want to understand the IoT Plug and Play bridge and learn how to build an IoT Plug and Play bridge adapter.
# Extend the IoT Plug and Play bridge
The [IoT Plug and Play bridge](concepts-iot-pnp-bridge.md#iot-plug-and-play-bridge-architecture) lets you connect the existing devices attached to a gateway to your IoT hub. You use the bridge to map IoT Plug and Play interfaces to the attached devices. An IoT Plug and Play interface defines the telemetry that a device sends, the properties synchronized between the device and the cloud, and the commands that the device responds to. You can install and configure the open-source bridge application on Windows or Linux gateways. Additionally, the bridge can be run as an Azure IoT Edge runtime module.
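As a sketch of what such an interface contains, here is a minimal DTDL v2 model expressed as a Python dict, declaring one telemetry, one writable property, and one command. The `dtmi:example:Thermostat;1` identifier and the element names are illustrative, not taken from the bridge documentation:

```python
import json

# Illustrative DTDL v2 interface: one Telemetry, one writable Property,
# and one Command (the dtmi identifier and names are hypothetical).
thermostat_interface = {
    "@context": "dtmi:dtdl:context;2",
    "@id": "dtmi:example:Thermostat;1",
    "@type": "Interface",
    "displayName": "Thermostat",
    "contents": [
        {"@type": "Telemetry", "name": "temperature", "schema": "double"},
        {"@type": "Property", "name": "targetTemperature",
         "schema": "double", "writable": True},
        {"@type": "Command", "name": "reboot"},
    ],
}

print(json.dumps(thermostat_interface, indent=2))
```

A bridge adapter's job is then to map readings and controls from an attached sensor onto the `contents` entries of an interface like this one.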
iot-pnp Howto Build Deploy Extend Pnp Bridge https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-pnp/howto-build-deploy-extend-pnp-bridge.md
-# As a device builder, I want to understand the IoT Plug and Play bridge, learn how to extend it, and learn how to run it on IoT devices, gateways, and as an IoT Edge module.
+#Customer intent: As a device builder, I want to understand the IoT Plug and Play bridge, learn how to extend it, and learn how to run it on IoT devices, gateways, and as an IoT Edge module.
# Build and deploy the IoT Plug and Play bridge
iot-pnp Howto Use Dtdl Authoring Tools https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-pnp/howto-use-dtdl-authoring-tools.md
-# As a solution builder, I want to use a DTDL editor to author and validate DTDL model files to use in my IoT Plug and Play solution.
+#Customer intent: As a solution builder, I want to use a DTDL editor to author and validate DTDL model files to use in my IoT Plug and Play solution.
# Install and use the DTDL authoring tools
iot-pnp Howto Use Embedded C https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-pnp/howto-use-embedded-c.md
-# As a device builder, I want to know about the options for implementing IoT Plug and Play on constrained devices.
+#Customer intent: As a device builder, I want to know about the options for implementing IoT Plug and Play on constrained devices.
# Implement IoT Plug and Play on constrained devices
iot-pnp Howto Use Iot Explorer https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-pnp/howto-use-iot-explorer.md
-# As a solution builder, I want to use a GUI tool to interact with IoT Plug and Play devices connected to an IoT hub to test and verify their behavior.
+#Customer intent: As a solution builder, I want to use a GUI tool to interact with IoT Plug and Play devices connected to an IoT hub to test and verify their behavior.
# Install and use Azure IoT explorer
iot-pnp Howto Use Iot Pnp Bridge https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-pnp/howto-use-iot-pnp-bridge.md
-# As a device builder, I want to see a working IoT Plug and Play device sample connecting to an IoT hub and sending properties and telemetry, and responding to commands. As a solution builder, I want to use a tool to view the properties, commands, and telemetry an IoT Plug and Play device reports to the IoT hub it connects to.
+#Customer intent: As a device builder, I want to see a working IoT Plug and Play device sample connecting to an IoT hub and sending properties and telemetry, and responding to commands. As a solution builder, I want to use a tool to view the properties, commands, and telemetry an IoT Plug and Play device reports to the IoT hub it connects to.
# How to connect an IoT Plug and Play bridge sample running on Linux or Windows to IoT Hub
iot-pnp Overview Iot Plug And Play https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-pnp/overview-iot-plug-and-play.md
-# As a device builder, I need to know what is IoT Plug and Play, so I can understand how it can help me build and market my IoT devices.
+#Customer intent: As a device builder, I need to know what IoT Plug and Play is, so I can understand how it can help me build and market my IoT devices.
# What is IoT Plug and Play?
iot-pnp Quickstart Connect Device https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-pnp/quickstart-connect-device.md
zone_pivot_groups: programming-languages-set-twenty-six
# - id: programming-language-python # Title: Python
-# As a device builder, I want to see a working IoT Plug and Play device sample connecting to IoT Hub and sending properties and telemetry, and responding to commands. As a solution builder, I want to use a tool to view the properties, commands, and telemetry an IoT Plug and Play device reports to the IoT hub it connects to.
+#Customer intent: As a device builder, I want to see a working IoT Plug and Play device sample connecting to IoT Hub and sending properties and telemetry, and responding to commands. As a solution builder, I want to use a tool to view the properties, commands, and telemetry an IoT Plug and Play device reports to the IoT hub it connects to.
# Quickstart: Connect a sample IoT Plug and Play device application running on Linux or Windows to IoT Hub
iot-pnp Quickstart Service https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-pnp/quickstart-service.md
zone_pivot_groups: programming-languages-set-ten
# - id: programming-language-python # Title: Python
-# As a solution builder, I want to connect to and interact with an IoT Plug and Play device that's connected to my solution. For example, to collect telemetry from the device or to control the behavior of the device.
+#Customer intent: As a solution builder, I want to connect to and interact with an IoT Plug and Play device that's connected to my solution. For example, to collect telemetry from the device or to control the behavior of the device.
# Quickstart: Interact with an IoT Plug and Play device that's connected to your solution
iot-pnp Tutorial Configure Tsi https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-pnp/tutorial-configure-tsi.md
-# As an IoT solution builder, I want to historize and analyze data from my IoT Plug and Play devices by routing to Time Series Insights.
+# Customer intent: As an IoT solution builder, I want to historize and analyze data from my IoT Plug and Play devices by routing to Time Series Insights.
# Tutorial: Create and configure a Time Series Insights Gen2 environment
iot-pnp Tutorial Migrate Device To Module https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-pnp/tutorial-migrate-device-to-module.md
-# As a device builder, I want to learn how to implement a module that works with IoT Plug and Play.
+#Customer intent: As a device builder, I want to learn how to implement a module that works with IoT Plug and Play.
# Tutorial: Connect an IoT Plug and Play module (C#)
iot-pnp Tutorial Multiple Components https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-pnp/tutorial-multiple-components.md
zone_pivot_groups: programming-languages-set-twenty-six
# - id: programming-language-python # Title: Python
-# As a device builder, I want to see a working IoT Plug and Play device sample connecting to IoT Hub and using multiple components to send properties and telemetry, and responding to commands. As a solution builder, I want to use a tool to view the properties, commands, and telemetry an IoT Plug and Play device reports to the IoT hub it connects to.
+#Customer intent: As a device builder, I want to see a working IoT Plug and Play device sample connecting to IoT Hub and using multiple components to send properties and telemetry, and responding to commands. As a solution builder, I want to use a tool to view the properties, commands, and telemetry an IoT Plug and Play device reports to the IoT hub it connects to.
# Tutorial: Connect IoT Plug and Play multiple component device applications running on Linux or Windows to IoT Hub
iot-pnp Tutorial Use Mqtt https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-pnp/tutorial-use-mqtt.md
-# As a device builder, I want to see how I can use the MQTT protocol to create an IoT Plug and Play device client without using the Azure IoT Device SDKs.
+#Customer intent: As a device builder, I want to see how I can use the MQTT protocol to create an IoT Plug and Play device client without using